Building a robust, scalable backend with Node.js and PostgreSQL is a critical task for many modern businesses, especially startups. Yet, even with these powerful tools, some common pitfalls can turn a promising application into a maintenance nightmare. Let's discuss five key mistakes to avoid while developing a backend with Node.js and PostgreSQL, along with practical examples.

1. Not Using Connection Pooling

In a high-load application, opening a new connection for every database request can seriously hamper performance. Establishing a PostgreSQL connection involves a TCP handshake, authentication, and the server spawning a dedicated backend process, so paying that cost on every request adds latency and wastes server resources.

Solution: Use connection pooling, a technique that allows your application to reuse existing connections. The pg module in Node.js supports this out of the box.

const { Pool } = require('pg');

// With no arguments, Pool reads its settings from the standard
// PG* environment variables (PGHOST, PGUSER, PGDATABASE, ...).
const pool = new Pool();

pool.query('SELECT NOW()', (err, res) => {
  console.log(err, res);
  // end() drains the pool; call it only on application shutdown,
  // not after every query.
  pool.end();
});
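
In production you will usually want to tune the pool rather than accept the defaults. Here is a minimal sketch using the standard pg Pool options; the numbers are assumptions that depend on your workload and on the server's max_connections setting:

const pool = new Pool({
  max: 20,                       // upper bound on concurrent connections
  idleTimeoutMillis: 30000,      // close clients idle for 30 seconds
  connectionTimeoutMillis: 2000, // fail fast when no connection is free
});

// The promise API works too, and composes well with async/await:
async function now() {
  const res = await pool.query('SELECT NOW()');
  return res.rows[0];
}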

2. Neglecting SQL Injection Protection

SQL injection is a notorious web application security vulnerability. It occurs when untrusted input is concatenated directly into a SQL string, letting an attacker change the query's meaning and read, modify, or destroy data.

Solution: Always treat user input as untrusted. Use parameterized queries or prepared statements to ensure security. The pg module supports parameterized queries.

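For contrast, here is the kind of query you should never build. This is a deliberately vulnerable sketch: an input like x'); DROP TABLE users; -- would run as part of the SQL.

// DON'T do this: user input becomes part of the SQL text itself.
const userInput = req.body.name;
pool.query(`INSERT INTO users(name) VALUES('${userInput}')`);

The parameterized version avoids this class of attack entirely:
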
// $1 and $2 are placeholders; pg sends the values separately from
// the SQL text, so they can never be interpreted as SQL commands.
const text = 'INSERT INTO users(name, email) VALUES($1, $2) RETURNING *';
const values = ['John', 'john@example.com'];

pool.query(text, values, (err, res) => {
  if (err) {
    console.error(err.stack);
  } else {
    console.log(res.rows[0]);
  }
});
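
pg also accepts a query config object with a name. PostgreSQL then parses and plans the statement once per connection and reuses the plan on subsequent calls:

const insertUser = {
  name: 'insert-user', // prepared once per connection, then reused
  text: 'INSERT INTO users(name, email) VALUES($1, $2) RETURNING *',
  values: ['John', 'john@example.com'],
};

pool.query(insertUser)
  .then((res) => console.log(res.rows[0]))
  .catch((err) => console.error(err.stack));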

3. Ignoring Error Handling

Error handling in Node.js can be tricky because most failures surface asynchronously: an unhandled promise rejection can bring down the whole process, while a silently swallowed error leaves requests hanging and makes debugging a nightmare.

Solution: Implement robust error handling in your application. Use async/await with try/catch blocks for cleaner, more readable code.

// Assumes an Express app with JSON body parsing enabled
// via app.use(express.json()).
app.post('/users', async (req, res) => {
  const { name, email } = req.body;
  try {
    const result = await pool.query(
      'INSERT INTO users (name, email) VALUES ($1, $2) RETURNING *',
      [name, email]
    );
    res.status(201).json(result.rows[0]);
  } catch (err) {
    // Log the full error server-side, but return a generic message:
    // err.message can leak schema or constraint details to clients.
    console.error(err);
    res.status(500).json({ message: 'Internal server error' });
  }
});
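
In a larger Express app you can centralize this instead of repeating try/catch logic in every route: handlers forward errors with next(err), and a single error-handling middleware (the four-argument signature is how Express recognizes one) deals with them. A minimal sketch:

app.post('/users', async (req, res, next) => {
  try {
    const result = await pool.query(
      'INSERT INTO users (name, email) VALUES ($1, $2) RETURNING *',
      [req.body.name, req.body.email]
    );
    res.status(201).json(result.rows[0]);
  } catch (err) {
    next(err); // hand off to the error middleware below
  }
});

// Registered after all routes; receives anything passed to next(err).
app.use((err, req, res, next) => {
  console.error(err);
  res.status(500).json({ message: 'Internal server error' });
});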

4. Hard-Coding Sensitive Information

Hard-coding sensitive information, like database credentials or API keys, directly in your code is a severe security risk. If that code is ever exposed or pushed to a public repository, the secrets are compromised and have to be rotated.

Solution: Use environment variables for storing sensitive data. In Node.js, you can use the dotenv package to load environment variables from a .env file.

// Load variables from .env into process.env before anything reads them.
require('dotenv').config();

const { Pool } = require('pg');

const pool = new Pool({
  user: process.env.DB_USER,
  host: process.env.DB_HOST,
  database: process.env.DB_NAME,
  password: process.env.DB_PASSWORD,
  port: Number(process.env.DB_PORT), // env vars are always strings
});
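
The matching .env file lives alongside your code but must stay out of version control, so add it to your .gitignore. The values here are placeholders:

# .env
DB_USER=app_user
DB_HOST=localhost
DB_NAME=app_db
DB_PASSWORD=change-me
DB_PORT=5432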

5. Overlooking Database Indexing

PostgreSQL performs well on small data sets with no tuning at all, but as a table grows, any query without a supporting index degrades into a sequential scan over every row. Overlooking indexing is one of the most common causes of slow endpoints.

Solution: Use indexing wisely in your database. Indexes can speed up read queries by allowing the database to look up data without scanning the entire table.

CREATE INDEX idx_users_email
ON users (email);
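
You can verify that a query actually benefits from the index using EXPLAIN ANALYZE; in the resulting plan, look for an Index Scan using idx_users_email rather than a Seq Scan:

EXPLAIN ANALYZE
SELECT * FROM users WHERE email = 'john@example.com';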

Remember, adding too many indexes can slow down write operations because each index needs to be updated when a change is made to the table. Therefore, it's all about finding the right balance.

By avoiding these common mistakes, you can harness the full potential of Node.js and PostgreSQL, creating efficient, secure, and scalable backends for your applications.