diff --git a/docs/pages/announcements.mdx b/docs/pages/announcements.mdx
index 6fec81ca3..2bc55e00a 100644
--- a/docs/pages/announcements.mdx
+++ b/docs/pages/announcements.mdx
@@ -79,7 +79,7 @@ _If you find `pg` valuable to you or your business please consider [supporting](
 After a _very_ long time on my todo list I've ported the docs from my old hand-rolled webapp running on route53 + elb + ec2 + dokku (I know, I went overboard!) to [gatsby](https://www.gatsbyjs.org/) hosted on [netlify](https://www.netlify.com/) which is _so_ much easier to manage. I've released the code at [https://github.com/brianc/node-postgres-docs](https://github.com/brianc/node-postgres-docs) and invite your contributions! Let's make this documentation better together. Any time changes are merged to master on the documentation repo it will automatically deploy.

-If you see an error in the docs, big or small, use the "edit on github" button to edit the page & submit a pull request right there. I'll get a new version out ASAP with your changes! If you want to add new pages of documentation open an issue if you need guidance, and I'll help you get started.
+If you see an error in the docs, big or small, use the "edit on GitHub" button to edit the page & submit a pull request right there. I'll get a new version out ASAP with your changes! If you want to add new pages of documentation, open an issue if you need guidance, and I'll help you get started.

 I want to extend a special **thank you** to all the [supporters](https://github.com/brianc/node-postgres/blob/master/SPONSORS.md) and [contributors](https://github.com/brianc/node-postgres/graphs/contributors) to the project that have helped keep me going through times of burnout or life "getting in the way."
❤️

@@ -116,7 +116,7 @@ pg@7.1.2
 To demonstrate the issue & see if you are vulnerable execute the following in node:

 ```js
-const { Client } = require('pg')
+import { Client } from 'pg'

 const client = new Client()
 client.connect()
diff --git a/docs/pages/apis/client.mdx b/docs/pages/apis/client.mdx
index d5f335240..340f95c6e 100644
--- a/docs/pages/apis/client.mdx
+++ b/docs/pages/apis/client.mdx
@@ -29,7 +29,7 @@ type Config = {
 example to create a client with specific connection information:

 ```js
-const { Client } = require('pg')
+import { Client } from 'pg'

 const client = new Client({
   host: 'my.database-server.com',
@@ -42,33 +42,13 @@ const client = new Client({

 ## client.connect

-Calling `client.connect` with a callback:
-
 ```js
-const { Client } = require('pg')
+import { Client } from 'pg'
 const client = new Client()
-client.connect((err) => {
-  if (err) {
-    console.error('connection error', err.stack)
-  } else {
-    console.log('connected')
-  }
-})
-```
-Calling `client.connect` without a callback yields a promise:
-
-```js
-const { Client } = require('pg')
-const client = new Client()
-client
-  .connect()
-  .then(() => console.log('connected'))
-  .catch((err) => console.error('connection error', err.stack))
+await client.connect()
 ```

-_note: connect returning a promise only available in pg@7.0 or above_
-
 ## client.query

 ### QueryConfig
@@ -95,77 +75,43 @@ type QueryConfig {
 }
 ```

-### callback API
-
 ```ts
-client.query(text: string, values?: any[], callback?: (err: Error, result: QueryResult) => void) => void
-```
-
-**Plain text query with a callback:**
-
-```js
-const { Client } = require('pg')
-const client = new Client()
-client.connect()
-client.query('SELECT NOW()', (err, res) => {
-  if (err) throw err
-  console.log(res)
-  client.end()
-})
+client.query(text: string, values?: any[]) => Promise
 ```

-**Parameterized query with a callback:**
+**Plain text query**

 ```js
-const { Client } = require('pg')
+import { Client } from 'pg'
 const client = new Client()
-client.connect()
-client.query('SELECT $1::text as name', ['brianc'], (err, res) => {
-  if (err) throw err
-  console.log(res)
-  client.end()
-})
-```
-### Promise API
+await client.connect()

-If you call `client.query` with query text and optional parameters but **don't** pass a callback, then you will receive a `Promise` for a query result.
+const result = await client.query('SELECT NOW()')
+console.log(result)

-```ts
-client.query(text: string, values?: any[]) => Promise
+await client.end()
 ```

-**Plain text query with a promise**
+**Parameterized query**

 ```js
-const { Client } = require('pg')
+import { Client } from 'pg'
 const client = new Client()
-client.connect()
-client
-  .query('SELECT NOW()')
-  .then((result) => console.log(result))
-  .catch((e) => console.error(e.stack))
-  .then(() => client.end())
-```
-**Parameterized query with a promise**
+await client.connect()

-```js
-const { Client } = require('pg')
-const client = new Client()
-client.connect()
-client
-  .query('SELECT $1::text as name', ['brianc'])
-  .then((result) => console.log(result))
-  .catch((e) => console.error(e.stack))
-  .then(() => client.end())
+const result = await client.query('SELECT $1::text as name', ['brianc'])
+console.log(result)
+
+await client.end()
 ```

 ```ts
 client.query(config: QueryConfig) => Promise
 ```

-**client.query with a QueryConfig and a callback**
+**client.query with a QueryConfig**

 If you pass a `name` parameter to the `client.query` method, the client will create a [prepared statement](/features/queries#prepared-statements).
@@ -177,34 +123,10 @@ const query = {
   rowMode: 'array',
 }

-client.query(query, (err, res) => {
-  if (err) {
-    console.error(err.stack)
-  } else {
-    console.log(res.rows) // ['brianc']
-  }
-})
-```
-
-**client.query with a QueryConfig and a Promise**
+const result = await client.query(query)
+console.log(result.rows) // ['brianc']

-```js
-const query = {
-  name: 'get-name',
-  text: 'SELECT $1::text',
-  values: ['brianc'],
-  rowMode: 'array',
-}
-
-// promise
-client
-  .query(query)
-  .then((res) => {
-    console.log(res.rows) // ['brianc']
-  })
-  .catch((e) => {
-    console.error(e.stack)
-  })
+await client.end()
 ```

 **client.query with a `Submittable`**

@@ -212,7 +134,7 @@ client
 If you pass an object to `client.query` and the object has a `.submit` function on it, the client will pass its PostgreSQL server connection to the object and delegate query dispatching to the supplied object. This is an advanced feature mostly intended for library authors. It is incidentally also currently how the callback- and promise-based queries above are handled internally, but this is subject to change. It is also how [pg-cursor](https://github.com/brianc/node-pg-cursor) and [pg-query-stream](https://github.com/brianc/node-pg-query-stream) work.

 ```js
-const Query = require('pg').Query
+import { Query } from 'pg'

 const query = new Query('select $1::text as name', ['brianc'])
 const result = client.query(query)
@@ -222,9 +144,11 @@ assert(query === result) // true
 query.on('row', (row) => {
   console.log('row!', row) // { name: 'brianc' }
 })
+
 query.on('end', () => {
   console.log('query done')
 })
+
 query.on('error', (err) => {
   console.error(err.stack)
 })
@@ -237,25 +161,10 @@ query.on('error', (err) => {

 Disconnects the client from the PostgreSQL server.
```js
-client.end((err) => {
-  console.log('client has disconnected')
-  if (err) {
-    console.log('error during disconnection', err.stack)
-  }
-})
+await client.end()
+console.log('client has disconnected')
```

-Calling end without a callback yields a promise:
-
-```js
-client
-  .end()
-  .then(() => console.log('client has disconnected'))
-  .catch((err) => console.error('error during disconnection', err.stack))
-```
-
-_note: end returning a promise is only available in pg7.0 and above_
-
 ## events

 ### error
@@ -264,7 +173,7 @@ _note: end returning a promise is only available in pg7.0 and above_
 client.on('error', (err: Error) => void) => void
```

-When the client is in the process of connecting, dispatching a query, or disconnecting it will catch and foward errors from the PostgreSQL server to the respective `client.connect` `client.query` or `client.end` callback/promise; however, the client maintains a long-lived connection to the PostgreSQL back-end and due to network partitions, back-end crashes, fail-overs, etc the client can (and over a long enough time period _will_) eventually be disconnected while it is idle. To handle this you may want to attach an error listener to a client to catch errors. Here's a contrived example:
+When the client is in the process of connecting, dispatching a query, or disconnecting, it will catch and forward errors from the PostgreSQL server to the respective `client.connect`, `client.query`, or `client.end` promise; however, the client maintains a long-lived connection to the PostgreSQL back-end and, due to network partitions, back-end crashes, fail-overs, etc., the client can (and over a long enough time period _will_) eventually be disconnected while it is idle. To handle this you may want to attach an error listener to a client to catch errors.
Here's a contrived example:

```js
const client = new pg.Client()
@@ -301,7 +210,7 @@ type Notification {

 ```js
 const client = new pg.Client()
-client.connect()
+await client.connect()

 client.query('LISTEN foo')
diff --git a/docs/pages/apis/cursor.mdx b/docs/pages/apis/cursor.mdx
index c4a6928c7..286e9ca5e 100644
--- a/docs/pages/apis/cursor.mdx
+++ b/docs/pages/apis/cursor.mdx
@@ -18,8 +18,8 @@ $ npm install pg pg-cursor
 Instantiates a new Cursor. A cursor is an instance of `Submittable` and should be passed directly to the `client.query` method.

 ```js
-const { Pool } = require('pg')
-const Cursor = require('pg-cursor')
+import { Pool } from 'pg'
+import Cursor from 'pg-cursor'

 const pool = new Pool()
 const client = await pool.connect()
@@ -48,7 +48,7 @@ type CursorQueryConfig {

 ## read

-### `cursor.read(rowCount: Number, callback: (err: Error, rows: Row[], result: pg.Result) => void) => void`
+### `cursor.read(rowCount: Number) => Promise`

-Read `rowCount` rows from the cursor instance. The callback will be called when the rows are available, loaded into memory, parsed, and converted to JavaScript types.
+Read `rowCount` rows from the cursor instance. The returned promise resolves when the rows are available, loaded into memory, parsed, and converted to JavaScript types.
@@ -57,25 +57,22 @@ If the cursor has read to the end of the result sets all subsequent calls to cur
 Here is an example of reading to the end of a cursor:

 ```js
-const { Pool } = require('pg')
-const Cursor = require('pg-cursor')
+import { Pool } from 'pg'
+import Cursor from 'pg-cursor'

 const pool = new Pool()
 const client = await pool.connect()

 const cursor = client.query(new Cursor('select * from generate_series(0, 5)'))
-cursor.read(100, (err, rows) => {
-  if (err) {
-    throw err
-  }
-  assert(rows.length == 6)
-  cursor.read(100, (err, rows) => {
-    assert(rows.length == 0)
-  })
-})
+
+let rows = await cursor.read(100)
+assert(rows.length == 6)
+
+rows = await cursor.read(100)
+assert(rows.length == 0)
```

 ## close

-### `cursor.close(callback: () => void) => void`
+### `cursor.close() => Promise`

 Used to close the cursor early.
If you want to stop reading from the cursor before you get all of the rows returned, call this.
diff --git a/docs/pages/apis/pool.mdx b/docs/pages/apis/pool.mdx
index 6323f2e2d..3cf32b6c4 100644
--- a/docs/pages/apis/pool.mdx
+++ b/docs/pages/apis/pool.mdx
@@ -48,7 +48,7 @@ type Config = {
 example to create a new pool with configuration:

 ```js
-const { Pool } = require('pg')
+import { Pool } from 'pg'

 const pool = new Pool({
   host: 'localhost',
@@ -68,33 +68,12 @@ pool.query(text: string, values?: any[]) => Promise
 ```

 ```js
-const { Pool } = require('pg')
+import { Pool } from 'pg'

 const pool = new Pool()

-pool
-  .query('SELECT $1::text as name', ['brianc'])
-  .then((res) => console.log(res.rows[0].name)) // brianc
-  .catch((err) => console.error('Error executing query', err.stack))
-```
-
-Callbacks are also supported:
-
-```ts
-pool.query(text: string, values?: any[], callback?: (err?: Error, result: pg.Result)) => void
-```
-
-```js
-const { Pool } = require('pg')
-
-const pool = new Pool()
-
-pool.query('SELECT $1::text as name', ['brianc'], (err, result) => {
-  if (err) {
-    return console.error('Error executing query', err.stack)
-  }
-  console.log(result.rows[0].name) // brianc
-})
+const result = await pool.query('SELECT $1::text as name', ['brianc'])
+console.log(result.rows[0].name) // brianc
```

 Notice in the example above there is no need to check out or release a client. The pool is doing the acquiring and releasing internally. I find `pool.query` to be a handy shortcut in many situations and use it exclusively unless I need a transaction.
@@ -112,7 +91,7 @@ Notice in the example above there is no need to check out or release a client. T

 ## pool.connect

-`pool.connect(callback: (err?: Error, client?: pg.Client, release?: releaseCallback) => void) => void`
+`pool.connect() => Promise`

 Acquires a client from the pool.
@@ -121,58 +100,37 @@ Acquires a client from the pool.
 - If the pool is 'full' and all clients are currently checked out, the request will wait in a FIFO queue until a client becomes available by it being released back to the pool.

```js
-const { Pool } = require('pg')
-
-const pool = new Pool()
-
-pool.connect((err, client, release) => {
-  if (err) {
-    return console.error('Error acquiring client', err.stack)
-  }
-  client.query('SELECT NOW()', (err, result) => {
-    release()
-    if (err) {
-      return console.error('Error executing query', err.stack)
-    }
-    console.log(result.rows)
-  })
-})
-```
-
-`pool.connect() => Promise`
-
-```js
-const { Pool } = require('pg')
+import { Pool } from 'pg'

 const pool = new Pool()

-;(async function () {
-  const client = await pool.connect()
-  await client.query('SELECT NOW()')
-  client.release()
-})()
+const client = await pool.connect()
+await client.query('SELECT NOW()')
+client.release()
```

 ### releasing clients

-`release: (err?: Error)`
+`client.release(destroy?: boolean) => void`

 Client instances returned from `pool.connect` will have a `release` method which will release them from the pool.

-The `release` method on an acquired client returns it back to the pool. If you pass a truthy value in the `err` position to the callback, instead of releasing the client to the pool, the pool will be instructed to disconnect and destroy this client, leaving a space within itself for a new client.
+The `release` method on an acquired client returns it back to the pool. If you pass a truthy value in the `destroy` parameter, instead of releasing the client to the pool, the pool will be instructed to disconnect and destroy this client, leaving a space within itself for a new client.
```js
-const { Pool } = require('pg')
+import { Pool } from 'pg'

 const pool = new Pool()
+
 // check out a single client
 const client = await pool.connect()
+
 // release the client
 client.release()
```

```js
-const { Pool } = require('pg')
+import { Pool } from 'pg'

 const pool = new Pool()
 assert(pool.totalCount === 0)
@@ -184,7 +142,7 @@ assert(pool.totalCount === 1)
 assert(pool.idleCount === 0)

 // tell the pool to destroy this client
 client.release(true)
 assert(pool.idleCount === 0)
 assert(pool.totalCount === 0)
```
@@ -205,17 +163,11 @@ Calling `pool.end` will drain the pool of all active clients, disconnect them, a

 ```js
-// again both promises and callbacks are supported:
-const { Pool } = require('pg')
+import { Pool } from 'pg'

 const pool = new Pool()

-// either this:
-pool.end(() => {
-  console.log('pool has ended')
-})
-
-// or this:
-pool.end().then(() => console.log('pool has ended'))
+await pool.end()
```

 ## properties
@@ -266,7 +218,7 @@ If the backend goes down or a network partition is encountered all the idle, con

 The error listener is passed the error as the first argument and the client upon which the error occurred as the 2nd argument. The client will be automatically terminated and removed from the pool; it is only passed to the error handler in case you want to inspect it.

-You probably want to add an event listener to the pool to catch background errors errors!
+You probably want to add an event listener to the pool to catch background errors!

 Just like other event emitters, if a pool emits an error event and no listeners are added node will emit an uncaught error and potentially crash your node process.
diff --git a/docs/pages/apis/result.mdx b/docs/pages/apis/result.mdx
index 8c130e88a..62888f112 100644
--- a/docs/pages/apis/result.mdx
+++ b/docs/pages/apis/result.mdx
@@ -18,7 +18,7 @@ Every result will have a rows array. If no rows are returned the array will be e
 Every result will have a fields array. This array contains the `name` and `dataTypeID` of each field in the result. These fields are ordered in the same order as the columns if you are using `arrayMode` for the query:

 ```js
-const { Pool } = require('pg')
+import { Pool } from 'pg'

 const pool = new Pool()
diff --git a/docs/pages/features/connecting.mdx b/docs/pages/features/connecting.mdx
index b3c5ecc40..212af68fd 100644
--- a/docs/pages/features/connecting.mdx
+++ b/docs/pages/features/connecting.mdx
@@ -7,17 +7,12 @@ title: Connecting
 node-postgres uses the same [environment variables](https://www.postgresql.org/docs/9.1/static/libpq-envars.html) as libpq and psql to connect to a PostgreSQL server. Both individual clients & pools will use these environment variables. Here's a tiny program connecting node.js to the PostgreSQL server:

 ```js
-const { Pool, Client } = require('pg')
+import { Pool, Client } from 'pg'

 // pools will use environment variables
 // for connection information
 const pool = new Pool()

-pool.query('SELECT NOW()', (err, res) => {
-  console.log(err, res)
-  pool.end()
-})
-
-// you can also use async/await
 const res = await pool.query('SELECT NOW()')
 await pool.end()
@@ -59,7 +54,7 @@ PGPORT=5432
 node-postgres also supports configuring a pool or client programmatically with connection information. Here's our same script from above modified to use programmatic (hard-coded in this case) values. This can be useful if your application already has a way to manage config values or you don't want to use environment variables.
```js
-const { Pool, Client } = require('pg')
+import { Pool, Client } from 'pg'

 const pool = new Pool({
   user: 'dbuser',
@@ -69,10 +64,7 @@ const pool = new Pool({
   port: 3211,
 })

-pool.query('SELECT NOW()', (err, res) => {
-  console.log(err, res)
-  pool.end()
-})
+console.log(await pool.query('SELECT NOW()'))

 const client = new Client({
   user: 'dbuser',
@@ -81,19 +73,19 @@ const client = new Client({
   password: 'secretpassword',
   port: 3211,
 })

-client.connect()
-client.query('SELECT NOW()', (err, res) => {
-  console.log(err, res)
-  client.end()
-})
+await client.connect()
+
+console.log(await client.query('SELECT NOW()'))
+
+await client.end()
```

 Many cloud providers include alternative methods for connecting to database instances using short-lived authentication tokens. node-postgres supports dynamic passwords via a callback function, either synchronous or asynchronous. The callback function must resolve to a string.

 ```js
-const { Pool } = require('pg')
-const { RDS } = require('aws-sdk')
+import { Pool } from 'pg'
+import { RDS } from 'aws-sdk'

 const signerOptions = {
   credentials: {
@@ -124,7 +116,7 @@ const pool = new Pool({
 Connections to unix sockets can also be made. This can be useful on distros like Ubuntu, where authentication is managed via the socket connection instead of a password.

 ```js
-const { Client } = require('pg')
+import { Client } from 'pg'

 const client = new Client({
   host: '/cloudsql/myproject:zone:mydb',
   user: 'username',
@@ -138,25 +130,23 @@ client = new Client({
 You can initialize both a pool and a client with a connection string URI as well. This is common in environments like Heroku where the database connection string is supplied to your application dyno through an environment variable. Connection string parsing brought to you by [pg-connection-string](https://github.com/iceddev/pg-connection-string).
```js
-const { Pool, Client } = require('pg')
+import { Pool, Client } from 'pg'

 const connectionString = 'postgresql://dbuser:secretpassword@database.server.com:3211/mydb'

 const pool = new Pool({
   connectionString,
 })

-pool.query('SELECT NOW()', (err, res) => {
-  console.log(err, res)
-  pool.end()
-})
+await pool.query('SELECT NOW()')
+await pool.end()

 const client = new Client({
   connectionString,
 })

-client.connect()
-client.query('SELECT NOW()', (err, res) => {
-  console.log(err, res)
-  client.end()
-})
+await client.connect()
+
+await client.query('SELECT NOW()')
+
+await client.end()
```
diff --git a/docs/pages/features/native.mdx b/docs/pages/features/native.mdx
index 698d6817b..c6f860119 100644
--- a/docs/pages/features/native.mdx
+++ b/docs/pages/features/native.mdx
@@ -15,10 +15,11 @@ $ npm install pg pg-native
-Once `pg-native` is installed instead of requiring a `Client` or `Pool` constructor from `pg` you do the following:
+Once `pg-native` is installed, instead of importing a `Client` or `Pool` constructor from `pg` you do the following:

 ```js
-const { Client, Pool } = require('pg').native
+import { native } from 'pg'
+const { Client, Pool } = native
```

-When you access the `.native` property on `require('pg')` it will automatically require the `pg-native` package and wrap it in the same API.
+When you access the `native` export of `pg` it will automatically require the `pg-native` package and wrap it in the same API.
Care has been taken to normalize between the two, but there might still be edge cases where things behave subtly differently due to the nature of using libpq over handling the binary protocol directly in JavaScript, so it's recommended you choose either the JavaScript driver or the native bindings for both development and production. For what it's worth: I use the pure JavaScript driver because it is more portable (doesn't need a compiler) and plenty fast.
diff --git a/docs/pages/features/pooling.mdx b/docs/pages/features/pooling.mdx
index e291080f2..1e4e0cde2 100644
--- a/docs/pages/features/pooling.mdx
+++ b/docs/pages/features/pooling.mdx
@@ -28,7 +28,7 @@ The client pool allows you to have a reusable pool of clients you can check out,
 ### Checkout, use, and return

 ```js
-const { Pool } = require('pg')
+import { Pool } from 'pg'

 const pool = new Pool()
@@ -39,46 +39,11 @@ pool.on('error', (err, client) => {
   process.exit(-1)
 })

-// callback - checkout a client
-pool.connect((err, client, done) => {
-  if (err) throw err
-  client.query('SELECT * FROM users WHERE id = $1', [1], (err, res) => {
-    done()
-
-    if (err) {
-      console.log(err.stack)
-    } else {
-      console.log(res.rows[0])
-    }
-  })
-})
-
-// promise - checkout a client
-pool.connect().then((client) => {
-  return client
-    .query('SELECT * FROM users WHERE id = $1', [1])
-    .then((res) => {
-      client.release()
-      console.log(res.rows[0])
-    })
-    .catch((err) => {
-      client.release()
-      console.log(err.stack)
-    })
-})
+const client = await pool.connect()
+const res = await client.query('SELECT * FROM users WHERE id = $1', [1])
+console.log(res.rows[0])

-// async/await - check out a client
-;(async () => {
-  const client = await pool.connect()
-  try {
-    const res = await client.query('SELECT * FROM users WHERE id = $1', [1])
-    console.log(res.rows[0])
-  } catch (err) {
-    console.log(err.stack)
-  } finally {
-    client.release()
-  }
-})()
+client.release()
```
@@ -95,44 +60,12 @@ pool.connect().then((client) => {
 If you don't need a transaction or you just need to run a single query, the pool has a convenience method to run a query on any available client in the pool. This is the preferred way to query with node-postgres if you can as it removes the risk of leaking a client.

 ```js
-const { Pool } = require('pg')
-
-const pool = new Pool()
-
-pool.query('SELECT * FROM users WHERE id = $1', [1], (err, res) => {
-  if (err) {
-    throw err
-  }
-
-  console.log('user:', res.rows[0])
-})
-```
-
-node-postgres also has built-in support for promises throughout all of its async APIs.
-
-```js
-const { Pool } = require('pg')
-
-const pool = new Pool()
+import { Pool } from 'pg'

-pool
-  .query('SELECT * FROM users WHERE id = $1', [1])
-  .then((res) => console.log('user:', res.rows[0]))
-  .catch((err) =>
-    setImmediate(() => {
-      throw err
-    })
-  )
-```
-
-Promises allow us to use `async`/`await` in node v8.0 and above (or earlier if you're using babel).
-
-```js
-const { Pool } = require('pg')
 const pool = new Pool()

-const { rows } = await pool.query('SELECT * FROM users WHERE id = $1', [1])
-console.log('user:', rows[0])
+const res = await pool.query('SELECT * FROM users WHERE id = $1', [1])
+console.log('user:', res.rows[0])
```

 ### Shutdown
@@ -140,7 +73,7 @@ console.log('user:', rows[0])
 To shut down a pool call `pool.end()` on the pool. This will wait for all checked-out clients to be returned and then shut down all the clients and the pool timers.

 ```js
-const { Pool } = require('pg')
+import { Pool } from 'pg'

 const pool = new Pool()
 console.log('starting async query')
diff --git a/docs/pages/features/queries.mdx b/docs/pages/features/queries.mdx
index 0deef0d0d..a2f6c6a5b 100644
--- a/docs/pages/features/queries.mdx
+++ b/docs/pages/features/queries.mdx
@@ -3,27 +3,14 @@ title: Queries
 slug: /features/queries
 ---

-The api for executing queries supports both callbacks and promises. I'll provide an example for both _styles_ here. For the sake of brevity I am using the `client.query` method instead of the `pool.query` method - both methods support the same API. In fact, `pool.query` delegates directly to `client.query` internally.
+For the sake of brevity I am using the `client.query` method instead of the `pool.query` method - both methods support the same API. In fact, `pool.query` delegates directly to `client.query` internally.

 ## Text only

 If your query has no parameters you do not need to pass them to the query method:

 ```js
-// callback
-client.query('SELECT NOW() as now', (err, res) => {
-  if (err) {
-    console.log(err.stack)
-  } else {
-    console.log(res.rows[0])
-  }
-})
-
-// promise
-client
-  .query('SELECT NOW() as now')
-  .then(res => console.log(res.rows[0]))
-  .catch(e => console.error(e.stack))
+await client.query('SELECT NOW() as now')
 ```

 ## Parameterized query
@@ -34,33 +21,9 @@ If you are passing parameters to your queries you will want to avoid string conc
 const text = 'INSERT INTO users(name, email) VALUES($1, $2) RETURNING *'
 const values = ['brianc', 'brian.m.carlson@gmail.com']

-// callback
-client.query(text, values, (err, res) => {
-  if (err) {
-    console.log(err.stack)
-  } else {
-    console.log(res.rows[0])
-    // { name: 'brianc', email: 'brian.m.carlson@gmail.com' }
-  }
-})
-
-// promise
-client
-  .query(text, values)
-  .then(res => {
-    console.log(res.rows[0])
-    // { name: 'brianc', email: 'brian.m.carlson@gmail.com' }
-  })
-  .catch(e => console.error(e.stack))
-
-// async/await
-try {
-  const res = await client.query(text, values)
-  console.log(res.rows[0])
-  // { name: 'brianc', email: 'brian.m.carlson@gmail.com' }
-} catch (err) {
-  console.log(err.stack)
-}
+const res = await client.query(text, values)
+console.log(res.rows[0])
+// { name: 'brianc', email: 'brian.m.carlson@gmail.com' }
 ```
@@ -112,20 +75,8 @@ const query = {
   values: ['brianc', 'brian.m.carlson@gmail.com'],
 }

-// callback
-client.query(query, (err, res) => {
-  if (err) {
-    console.log(err.stack)
-  } else {
-    console.log(res.rows[0])
-  }
-})
-
-// promise
-client
-  .query(query)
-  .then(res => console.log(res.rows[0]))
-  .catch(e => console.error(e.stack))
+const res = await client.query(query)
+console.log(res.rows[0])
```

 The query config object allows for a few more advanced scenarios:
@@ -142,20 +93,8 @@ const query = {
   values: [1],
 }

-// callback
-client.query(query, (err, res) => {
-  if (err) {
-    console.log(err.stack)
-  } else {
-    console.log(res.rows[0])
-  }
-})
-
-// promise
-client
-  .query(query)
-  .then(res => console.log(res.rows[0]))
-  .catch(e => console.error(e.stack))
+const res = await client.query(query)
+console.log(res.rows[0])
```

 In the above example the first time the client sees a query with the name `'fetch-user'` it will send a 'parse' request to the PostgreSQL server & execute the query as normal. The second time, it will skip the 'parse' request and send the _name_ of the query to the PostgreSQL server.
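To make that reuse explicit, here is a small sketch: the `fetch-user` name and query shape come from the example above, while the second config object is my own illustration (the query calls are commented out since they require a connected client):

```javascript
// the named query config; the server keeps the parsed statement
// cached under this name for the lifetime of the connection
const fetchUser = {
  name: 'fetch-user',
  text: 'SELECT * FROM user WHERE id = $1',
  values: [1],
}

// a later execution reuses the prepared statement: same name and text,
// only the values change
const fetchAnotherUser = { ...fetchUser, values: [2] }

// await client.query(fetchUser)        // first call: parse + bind + execute
// await client.query(fetchAnotherUser) // later calls: bind + execute only
```

Note that the name is scoped to a single connection, so reuse only pays off when the follow-up queries run on the same client.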
@@ -177,24 +116,9 @@ const query = {
   rowMode: 'array',
 }

-// callback
-client.query(query, (err, res) => {
-  if (err) {
-    console.log(err.stack)
-  } else {
-    console.log(res.fields.map(field => field.name)) // ['first_name', 'last_name']
-    console.log(res.rows[0]) // ['Brian', 'Carlson']
-  }
-})
-
-// promise
-client
-  .query(query)
-  .then(res => {
-    console.log(res.fields.map(field => field.name)) // ['first_name', 'last_name']
-    console.log(res.rows[0]) // ['Brian', 'Carlson']
-  })
-  .catch(e => console.error(e.stack))
+const res = await client.query(query)
+console.log(res.fields.map(field => field.name)) // ['first_name', 'last_name']
+console.log(res.rows[0]) // ['Brian', 'Carlson']
```

 ### Types
diff --git a/docs/pages/features/ssl.mdx b/docs/pages/features/ssl.mdx
index 0428d0549..95683aca1 100644
--- a/docs/pages/features/ssl.mdx
+++ b/docs/pages/features/ssl.mdx
@@ -25,24 +25,15 @@ const config = {
 import { Client, Pool } from 'pg'

 const client = new Client(config)
-client.connect(err => {
-  if (err) {
-    console.error('error connecting', err.stack)
-  } else {
-    console.log('connected')
-    client.end()
-  }
-})
+await client.connect()
+console.log('connected')
+await client.end()

 const pool = new Pool(config)
-pool
-  .connect()
-  .then(client => {
-    console.log('connected')
-    client.release()
-  })
-  .catch(err => console.error('error connecting', err.stack))
-  .then(() => pool.end())
+const pooledClient = await pool.connect()
+console.log('connected')
+pooledClient.release()
+await pool.end()
```

 ## Usage with `connectionString`
diff --git a/docs/pages/features/transactions.mdx b/docs/pages/features/transactions.mdx
index 408db52f8..492cbbe0e 100644
--- a/docs/pages/features/transactions.mdx
+++ b/docs/pages/features/transactions.mdx
@@ -15,16 +15,10 @@ To execute a transaction with node-postgres you simply execute `BEGIN / COMMIT /
 ## Examples

-### async/await
-
-Things are considerably more straightforward if you're using async/await:
-
 ```js
-const { Pool } = require('pg')
+import { Pool } from 'pg'
 const pool = new Pool()

-// note: we don't try/catch this because if connecting throws an exception
-// we don't need to dispose of the client (it will be undefined)
 const client = await pool.connect()

 try {
@@ -42,52 +36,4 @@ try {
 } finally {
   client.release()
 }
-```
-
-### callbacks
-
-node-postgres is a very old library, and still has an optional callback API. Here's an example of doing the same code above, but with callbacks:
-
-```js
-const { Pool } = require('pg')
-const pool = new Pool()
-
-pool.connect((err, client, done) => {
-  const shouldAbort = (err) => {
-    if (err) {
-      console.error('Error in transaction', err.stack)
-      client.query('ROLLBACK', (err) => {
-        if (err) {
-          console.error('Error rolling back client', err.stack)
-        }
-        // release the client back to the pool
-        done()
-      })
-    }
-    return !!err
-  }
-
-  client.query('BEGIN', (err) => {
-    if (shouldAbort(err)) return
-    const queryText = 'INSERT INTO users(name) VALUES($1) RETURNING id'
-    client.query(queryText, ['brianc'], (err, res) => {
-      if (shouldAbort(err)) return
-
-      const insertPhotoText = 'INSERT INTO photos(user_id, photo_url) VALUES ($1, $2)'
-      const insertPhotoValues = [res.rows[0].id, 's3.bucket.foo']
-      client.query(insertPhotoText, insertPhotoValues, (err, res) => {
-        if (shouldAbort(err)) return
-
-        client.query('COMMIT', (err) => {
-          if (err) {
-            console.error('Error committing transaction', err.stack)
-          }
-          done()
-        })
-      })
-    })
-  })
-})
-```
-
-..thank goodness for `async/await` yeah?
+```
\ No newline at end of file
diff --git a/docs/pages/guides/async-express.md b/docs/pages/guides/async-express.md
index 3be6d955a..982fdc50c 100644
--- a/docs/pages/guides/async-express.md
+++ b/docs/pages/guides/async-express.md
@@ -22,21 +22,18 @@ That's the same structure I used in the [project structure](/guides/project-stru
 My `db/index.js` file usually starts out like this:

 ```js
-const { Pool } = require('pg')
+import { Pool } from 'pg'

 const pool = new Pool()

-module.exports = {
-  query: (text, params) => pool.query(text, params),
-}
+export const query = (text, params) => pool.query(text, params)
 ```

 Then I will install [express-promise-router](https://www.npmjs.com/package/express-promise-router) and use it to define my routes. Here is my `routes/user.js` file:

 ```js
-const Router = require('express-promise-router')
-
-const db = require('../db')
+import Router from 'express-promise-router'
+import * as db from '../db/index.js'

 // create a new express-promise-router
 // this has the same API as the normal express router except
@@ -44,7 +41,7 @@ const db = require('../db')
 const router = new Router()

 // export our router to be mounted by the parent application
-module.exports = router
+export default router

 router.get('/:id', async (req, res) => {
   const { id } = req.params
@@ -57,22 +54,24 @@ Then in my `routes/index.js` file I'll have something like this which mounts eac

 ```js
 // ./routes/index.js
-const users = require('./user')
-const photos = require('./photos')
+import users from './user.js'
+import photos from './photos.js'

-module.exports = (app) => {
+const mountRoutes = (app) => {
   app.use('/users', users)
   app.use('/photos', photos)
   // etc..
 }
+
+export default mountRoutes
 ```

 And finally in my `app.js` file where I bootstrap express I will have my `routes/index.js` file mount all my routes. The routes know they're using async functions but because of express-promise-router the main express app doesn't know and doesn't care!
 ```js
 // ./app.js
-const express = require('express')
-const mountRoutes = require('./routes')
+import express from 'express'
+import mountRoutes from './routes/index.js'
 
 const app = express()
 mountRoutes(app)
diff --git a/docs/pages/guides/project-structure.md b/docs/pages/guides/project-structure.md
index 742451daa..94dcc1a30 100644
--- a/docs/pages/guides/project-structure.md
+++ b/docs/pages/guides/project-structure.md
@@ -11,8 +11,6 @@ Whenever I am writing a project & using node-postgres I like to create a file wi
 ## example
 
-_note: I am using callbacks in this example to introduce as few concepts as possible at a time, but the same is doable with promises or async/await_
-
 The location doesn't really matter - I've found it usually ends up being somewhat app specific and in line with whatever folder structure conventions you're using. For this example I'll use an express app structured like so:
 
 ```
@@ -29,14 +27,12 @@ The location doesn't really matter - I've found it usually ends up being somewha
 Typically I'll start out my `db/index.js` file like so:
 
 ```js
-const { Pool } = require('pg')
+import { Pool } from 'pg'
 
 const pool = new Pool()
 
-module.exports = {
-  query: (text, params, callback) => {
-    return pool.query(text, params, callback)
-  },
+export const query = (text, params, callback) => {
+  return pool.query(text, params, callback)
 }
 ```
 
@@ -45,15 +41,11 @@ That's it. But now everywhere else in my application instead of requiring `pg` d
 
 ```js
 // notice here I'm requiring my database adapter file
 // and not requiring node-postgres directly
-const db = require('../db')
-
-app.get('/:id', (req, res, next) => {
-  db.query('SELECT * FROM users WHERE id = $1', [req.params.id], (err, result) => {
-    if (err) {
-      return next(err)
-    }
-    res.send(result.rows[0])
-  })
+import * as db from '../db/index.js'
+
+app.get('/:id', async (req, res, next) => {
+  const result = await db.query('SELECT * FROM users WHERE id = $1', [req.params.id])
+  res.send(result.rows[0])
 })
 
 // ... many other routes in this file
@@ -62,19 +54,16 @@ app.get('/:id', (req, res, next) => {
 Imagine we have lots of routes scattered throughout many files under our `routes/` directory. We now want to go back and log every single query that's executed, how long it took, and the number of rows it returned. If we had required node-postgres directly in every route file we'd have to go edit every single route - that would take forever & be really error prone! But thankfully we put our data access into `db/index.js`. Let's go add some logging:
 
 ```js
-const { Pool } = require('pg')
+import { Pool } from 'pg'
 
 const pool = new Pool()
 
-module.exports = {
-  query: (text, params, callback) => {
-    const start = Date.now()
-    return pool.query(text, params, (err, res) => {
-      const duration = Date.now() - start
-      console.log('executed query', { text, duration, rows: res.rowCount })
-      callback(err, res)
-    })
-  },
+export const query = async (text, params) => {
+  const start = Date.now()
+  const res = await pool.query(text, params)
+  const duration = Date.now() - start
+  console.log('executed query', { text, duration, rows: res.rowCount })
+  return res
 }
 ```
 
@@ -85,112 +74,57 @@ _note: I didn't log the query parameters. Depending on your application you migh
 Now what if we need to check out a client from the pool to run several queries in a row in a transaction? We can add another method to our `db/index.js` file when we need to do this:
 
 ```js
-const { Pool } = require('pg')
+import { Pool } from 'pg'
 
 const pool = new Pool()
 
-module.exports = {
-  query: (text, params, callback) => {
-    const start = Date.now()
-    return pool.query(text, params, (err, res) => {
-      const duration = Date.now() - start
-      console.log('executed query', { text, duration, rows: res.rowCount })
-      callback(err, res)
-    })
-  },
-  getClient: (callback) => {
-    pool.connect((err, client, done) => {
-      callback(err, client, done)
-    })
-  },
+export const query = async (text, params) => {
+  const start = Date.now()
+  const res = await pool.query(text, params)
+  const duration = Date.now() - start
+  console.log('executed query', { text, duration, rows: res.rowCount })
+  return res
+}
+
+export const getClient = () => {
+  return pool.connect()
 }
 ```
 
 Okay. Great - the simplest thing that could possibly work. It seems like one of our routes that checks out a client to run a transaction is forgetting to call `release` in some situation! Oh no! We are leaking a client & have hundreds of these routes to go audit. Good thing we have all our client access going through this single file. Let's add some deeper diagnostic information here to help us track down where the client leak is happening.
 ```js
-const { Pool } = require('pg')
+import { Pool } from 'pg'
 
 const pool = new Pool()
 
-module.exports = {
-  query: (text, params, callback) => {
-    const start = Date.now()
-    return pool.query(text, params, (err, res) => {
-      const duration = Date.now() - start
-      console.log('executed query', { text, duration, rows: res.rowCount })
-      callback(err, res)
-    })
-  },
-  getClient: (callback) => {
-    pool.connect((err, client, done) => {
-      const query = client.query
-
-      // monkey patch the query method to keep track of the last query executed
-      client.query = (...args) => {
-        client.lastQuery = args
-        return query.apply(client, args)
-      }
-
-      // set a timeout of 5 seconds, after which we will log this client's last query
-      const timeout = setTimeout(() => {
-        console.error('A client has been checked out for more than 5 seconds!')
-        console.error(`The last executed query on this client was: ${client.lastQuery}`)
-      }, 5000)
-
-      const release = (err) => {
-        // call the actual 'done' method, returning this client to the pool
-        done(err)
-
-        // clear our timeout
-        clearTimeout(timeout)
-
-        // set the query method back to its old un-monkey-patched version
-        client.query = query
-      }
-
-      callback(err, client, release)
-    })
-  },
+export const query = async (text, params) => {
+  const start = Date.now()
+  const res = await pool.query(text, params)
+  const duration = Date.now() - start
+  console.log('executed query', { text, duration, rows: res.rowCount })
+  return res
 }
-```
-
-Using async/await:
-
-```js
-module.exports = {
-  async query(text, params) {
-    const start = Date.now()
-    const res = await pool.query(text, params)
-    const duration = Date.now() - start
-    console.log('executed query', { text, duration, rows: res.rowCount })
-    return res
-  },
-
-  async getClient() {
-    const client = await pool.connect()
-    const query = client.query
-    const release = client.release
-    // set a timeout of 5 seconds, after which we will log this client's last query
-    const timeout = setTimeout(() => {
-      console.error('A client has been checked out for more than 5 seconds!')
-      console.error(`The last executed query on this client was: ${client.lastQuery}`)
-    }, 5000)
-    // monkey patch the query method to keep track of the last query executed
-    client.query = (...args) => {
-      client.lastQuery = args
-      return query.apply(client, args)
-    }
-    client.release = () => {
-      // clear our timeout
-      clearTimeout(timeout)
-      // set the methods back to their old un-monkey-patched version
-      client.query = query
-      client.release = release
-      return release.apply(client)
-    }
-    return client
-  },
+}
+
+export const getClient = async () => {
+  const client = await pool.connect()
+  const query = client.query
+  const release = client.release
+  // set a timeout of 5 seconds, after which we will log this client's last query
+  const timeout = setTimeout(() => {
+    console.error('A client has been checked out for more than 5 seconds!')
+    console.error(`The last executed query on this client was: ${client.lastQuery}`)
+  }, 5000)
+  // monkey patch the query method to keep track of the last query executed
+  client.query = (...args) => {
+    client.lastQuery = args
+    return query.apply(client, args)
+  }
+  client.release = () => {
+    // clear our timeout
+    clearTimeout(timeout)
+    // set the methods back to their old un-monkey-patched version
+    client.query = query
+    client.release = release
+    return release.apply(client)
+  }
+  return client
 }
 ```
diff --git a/docs/pages/guides/upgrading.md b/docs/pages/guides/upgrading.md
index 2a1d311a2..e3bd941c8 100644
--- a/docs/pages/guides/upgrading.md
+++ b/docs/pages/guides/upgrading.md
@@ -102,7 +102,7 @@ If you do **not** pass a callback `client.query` will return an instance of a `P
 `client.query` has always accepted any object that has a `.submit` method on it. In this scenario the client calls `.submit` on the object, delegating execution responsibility to it. In this situation the client also **returns the instance it was passed**. This is how [pg-cursor](https://github.com/brianc/node-pg-cursor) and [pg-query-stream](https://github.com/brianc/node-pg-query-stream) work. So, if you need the event emitter functionality on your queries for some reason, it is still possible because `Query` is an instance of `Submittable`:
 
 ```js
-const { Client, Query } = require('pg')
+import { Client, Query } from 'pg'
 const query = client.query(new Query('SELECT NOW()'))
 query.on('row', row => {})
 query.on('end', res => {})
diff --git a/docs/pages/index.mdx b/docs/pages/index.mdx
index 2e14116b5..d785d327f 100644
--- a/docs/pages/index.mdx
+++ b/docs/pages/index.mdx
@@ -15,7 +15,7 @@ $ npm install pg
 node-postgres continued development and support is made possible by the many [supporters](https://github.com/brianc/node-postgres/blob/master/SPONSORS.md).
 
-If you or your company would like to sponsor node-postgres stop by [github sponsors](https://github.com/sponsors/brianc) and sign up or feel free to [email me](mailto:brian@pecanware.com) if you want to add your logo to the documentation or discuss higher tiers of sponsorship!
+If you or your company would like to sponsor node-postgres stop by [GitHub Sponsors](https://github.com/sponsors/brianc) and sign up or feel free to [email me](mailto:brian@pecanware.com) if you want to add your logo to the documentation or discuss higher tiers of sponsorship!
 
 # Version compatibility
 
@@ -23,10 +23,10 @@ node-postgres strives to be compatible with all recent lts versions of node & th
 
 ## Getting started
 
-This is the simplest possible way to connect, query, and disconnect with async/await:
+The simplest possible way to connect, query, and disconnect is with async/await:
 
 ```js
-const { Client } = require('pg')
+import { Client } from 'pg'
 const client = new Client()
 await client.connect()
 
@@ -35,18 +35,40 @@ console.log(res.rows[0].message) // Hello world!
 await client.end()
 ```
 
-And here's the same thing with callbacks:
+### Error Handling
 
-```js
-const { Client } = require('pg')
+For the sake of simplicity, these docs will assume that the methods are successful. In real-life use, make sure to properly handle errors thrown in the methods. A `try/catch` block is a great way to do so:
+
+```js
+import { Client } from 'pg'
 const client = new Client()
+await client.connect()
+
+try {
+  const res = await client.query('SELECT $1::text as message', ['Hello world!'])
+  console.log(res.rows[0].message) // Hello world!
+} catch (err) {
+  console.error(err)
+} finally {
+  await client.end()
+}
+```
 
-client.connect()
+### Callbacks
 
-client.query('SELECT $1::text as message', ['Hello world!'], (err, res) => {
-  console.log(err ? err.stack : res.rows[0].message) // Hello World!
-  client.end()
+If you prefer a callback-style approach to asynchronous programming, all async methods support an optional callback parameter as well:
+
+```js
+import { Client } from 'pg'
+const client = new Client()
+
+client.connect((err) => {
+  if (err) throw err
+  client.query('SELECT $1::text as message', ['Hello world!'], (err, res) => {
+    console.log(err ? err.stack : res.rows[0].message) // Hello world!
+    client.end()
+  })
 })
 ```
 
 Our real-world apps are almost always more complicated than that, and I urge you to read on!
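The guides in this diff all funnel queries through a single `db/index.js` wrapper around a `Pool`. A minimal runnable sketch of that pattern is below; the `pool` object here is a stub (an assumption for illustration, so the sketch runs without a live Postgres server), but its `query(text, values)` shape mirrors the real `pg` pool, so swapping in `import { Pool } from 'pg'` and `const pool = new Pool()` gives the wrapper from the guide:

```javascript
// db/index.js-style wrapper. The pool is STUBBED so this sketch runs
// without a database; with the real library you would instead write:
//   import { Pool } from 'pg'
//   const pool = new Pool()
const pool = {
  // mimics pg's pool.query(text, values) -> Promise of { rows, rowCount }
  query: async (text, values) => ({ rows: [{ message: values[0] }], rowCount: 1 }),
}

// every query in the app goes through this one function, so cross-cutting
// concerns like timing and logging live in a single place
const query = async (text, params) => {
  const start = Date.now()
  const res = await pool.query(text, params)
  const duration = Date.now() - start
  console.log('executed query', { text, duration, rows: res.rowCount })
  return res
}

query('SELECT $1::text as message', ['Hello world!']).then((res) => {
  console.log(res.rows[0].message) // → Hello world!
})
```

Because routes only ever import `query`, the logging (or any later diagnostics) is added once here rather than in every route file.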