
SQL Session Management

Think of database connections like phone calls to a busy service center. You want to minimize the number of calls, keep them as short as possible, handle any miscommunications gracefully, and make sure you hang up properly when done.

Why Database Session Management Matters

Proper database session management is crucial for several reasons:

  1. Performance: Database connections are resource-intensive and often limited
  2. Reliability: Properly managed sessions prevent resource leaks and application crashes
  3. Consistency: Well-managed transactions maintain data integrity
  4. Scalability: Efficient connection pooling allows applications to handle more concurrent users
  5. Security: Proper session cleanup prevents unauthorized access through lingering connections

Key Concepts in SQL Session Management

🔌 Database Connection: A communication channel between your application and the database server.
   Example: creating a new connection with new SqlConnection(connectionString)

🏊 Connection Pool: A cache of database connections maintained for reuse.
   Example: most modern ORMs and database libraries use connection pooling by default

🔄 Transaction: A sequence of operations performed as a single logical unit of work.
   Example: BEGIN TRANSACTION; UPDATE accounts; COMMIT TRANSACTION;

👤 Session: A series of interactions between the client and the database within a connection.
   Example: all queries executed during a user request

🔗 Connection String: Information needed to establish a connection to a database.
   Example: Server=myserver.database.windows.net;Database=mydb;User ID=username;Password=password;

Connection Management Patterns

Connection Pooling

Connection pooling is a technique used to improve performance by reusing database connections rather than creating new ones for each client request.

Instead of everyone in your office making their own coffee each time (boiling water, grinding beans, etc.), you have a few coffee pots ready that people can use and return when done.

How Connection Pooling Works

  1. When your application starts, the pool is initialized with a minimum number of connections
  2. When a connection is requested, the pool provides an idle connection if available
  3. If no connections are available and the maximum pool size hasn't been reached, a new connection is created
  4. When a connection is released, it's returned to the pool for future use
  5. Connections that have been idle for too long may be closed to free up resources
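
To make the lifecycle above concrete, here is a minimal sketch of a pool. It is illustrative only: createRawConnection is a placeholder for whatever call your driver uses to open a physical connection, and step 5 (closing long-idle connections) is omitted for brevity.

// Minimal connection pool sketch illustrating steps 1-4 above
// (createRawConnection is a hypothetical driver-level connect call)
class SimpleConnectionPool {
  constructor({ createRawConnection, min = 2, max = 10 }) {
    this.createRawConnection = createRawConnection;
    this.max = max;
    this.idle = [];     // connections waiting to be reused
    this.size = 0;      // total connections created so far
    this.waiters = [];  // queued requests when the pool is exhausted

    // Step 1: initialize the pool with a minimum number of connections
    for (let i = 0; i < min; i++) {
      this.idle.push(this.createRawConnection());
      this.size++;
    }
  }

  async getConnection() {
    // Step 2: hand out an idle connection if one is available
    if (this.idle.length > 0) {
      return this.idle.pop();
    }
    // Step 3: create a new connection if the maximum hasn't been reached
    if (this.size < this.max) {
      this.size++;
      return this.createRawConnection();
    }
    // Otherwise queue the request until a connection is released
    return new Promise(resolve => this.waiters.push(resolve));
  }

  release(connection) {
    // Step 4: return the connection to the pool, or hand it directly
    // to the next queued request
    const waiter = this.waiters.shift();
    if (waiter) {
      waiter(connection);
    } else {
      this.idle.push(connection);
    }
  }
}

Production pools also validate connections before reuse and evict ones that have sat idle too long (step 5).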

Connection Pooling Implementation

Comparing manual connection management vs. using a connection pool

Without Pooling (Avoid)
// Without connection pooling
function executeQuery(sql, params) {
// Create a new connection for every query
const connection = new SqlConnection(connectionString);
try {
// Open a new connection
connection.open();

const command = new SqlCommand(sql, connection);
if (params) {
for (const key in params) {
command.parameters.add(new SqlParameter(key, params[key]));
}
}

return command.executeQuery();
} finally {
// Close connection after each query
connection.close();
}
}
With Pooling (Recommended)
// With connection pooling
// Connection pool is created once at application startup
const pool = new SqlConnectionPool(connectionString);

async function executeQuery(sql, params) {
// Get a connection from the pool
const connection = await pool.getConnection();
try {
const command = new SqlCommand(sql, connection);
if (params) {
for (const key in params) {
command.parameters.add(new SqlParameter(key, params[key]));
}
}

return await command.executeQueryAsync();
} finally {
// Return connection to the pool instead of closing
connection.release();
}
}

Connection Pool Configuration

Optimizing your connection pool settings is crucial for performance:

// Example pool configuration
const pool = mysql.createPool({
host: 'localhost',
user: 'user',
password: 'password',
database: 'mydb',
// Core pool settings
connectionLimit: 10, // Maximum connections in the pool
minIdle: 2, // Minimum idle connections to maintain
idleTimeout: 60000, // How long a connection can be idle before being closed (ms)
// Advanced settings
acquireTimeout: 10000, // Maximum time to wait for a connection from the pool
createTimeout: 30000, // Maximum time to wait for a new connection to be established
waitForConnections: true // Whether to queue connection requests when none are available
});
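
Equally important is closing the pool when the application shuts down so in-flight work can drain. A minimal sketch, assuming a promise-based pool such as mysql2/promise:

// Drain and close the pool during graceful shutdown
process.on('SIGTERM', async () => {
  try {
    await pool.end(); // closes idle connections and lets pending queries finish
  } finally {
    process.exit(0);
  }
});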

Unit of Work Pattern

The Unit of Work pattern tracks all database operations to be performed during a business transaction and coordinates the writing out of changes.

Like a shopping cart that holds all your items until you check out. You can add, modify, or remove items, but nothing is actually purchased until you complete the entire transaction.

Unit of Work Implementation

Managing related database operations cohesively

Without Unit of Work (Avoid)
// Without Unit of Work pattern
async function transferMoney(fromAccountId, toAccountId, amount) {
// Get connection and create two separate transactions
const fromConnection = await getConnection();
const toConnection = await getConnection();

try {
// Start transactions separately
await fromConnection.beginTransaction();
await toConnection.beginTransaction();

// Deduct from first account
await fromConnection.query(
'UPDATE accounts SET balance = balance - ? WHERE id = ?',
[amount, fromAccountId]
);

// What if an error happens here?
// One account might be updated while the other isn't

// Add to second account
await toConnection.query(
'UPDATE accounts SET balance = balance + ? WHERE id = ?',
[amount, toAccountId]
);

// Commit transactions separately
await fromConnection.commit();
await toConnection.commit();
} catch (error) {
// Attempt to roll back, but what if one fails and one succeeds?
try { await fromConnection.rollback(); } catch {}
try { await toConnection.rollback(); } catch {}
throw error;
} finally {
// Release connections
fromConnection.release();
toConnection.release();
}
}
With Unit of Work (Recommended)
// With Unit of Work pattern
async function transferMoney(fromAccountId, toAccountId, amount) {
// Create a single unit of work
const unitOfWork = new UnitOfWork();

try {
// Begin a single transaction
await unitOfWork.beginTransaction();

// Register all operations within the same transaction
await unitOfWork.query(
'UPDATE accounts SET balance = balance - ? WHERE id = ?',
[amount, fromAccountId]
);

await unitOfWork.query(
'UPDATE accounts SET balance = balance + ? WHERE id = ?',
[amount, toAccountId]
);

// Insert into transaction log
await unitOfWork.query(
'INSERT INTO transaction_log (from_account, to_account, amount) VALUES (?, ?, ?)',
[fromAccountId, toAccountId, amount]
);

// Commit the entire unit of work
await unitOfWork.commit();
} catch (error) {
// Roll back the entire unit of work
await unitOfWork.rollback();
throw error;
} finally {
// Clean up resources
await unitOfWork.release();
}
}
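
The UnitOfWork class above is assumed rather than shown. A minimal sketch, built on the pooled connection API used in the earlier examples, could look like the following; a fuller implementation would also track new, changed, and removed entities and flush them at commit time.

// Minimal sketch of a UnitOfWork: one pooled connection and one transaction,
// shared by every operation registered against the unit
class UnitOfWork {
  constructor() {
    this.connection = null;
  }

  async beginTransaction() {
    this.connection = await pool.getConnection();
    await this.connection.beginTransaction();
  }

  async query(sql, params) {
    // Every query runs on the same connection and transaction
    return this.connection.query(sql, params);
  }

  async commit() {
    await this.connection.commit();
  }

  async rollback() {
    if (this.connection) {
      await this.connection.rollback();
    }
  }

  async release() {
    if (this.connection) {
      this.connection.release();
      this.connection = null;
    }
  }
}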

Transaction Management

Proper transaction management ensures data integrity and consistency in your database.

Transaction Isolation Levels

It's like setting privacy rules for how people working on different things can see each other's work - from complete isolation (each person works in a separate room) to complete visibility (everyone in the same room).

SQL databases typically offer these isolation levels:

  1. Read Uncommitted: Lowest isolation; allows dirty reads (viewing uncommitted changes)
  2. Read Committed: Prevents dirty reads; only committed changes are visible
  3. Repeatable Read: Ensures that rows already read within a transaction return the same values if read again (phantom rows may still appear)
  4. Serializable: Highest isolation; ensures transactions execute as if they were sequential

Setting Transaction Isolation Levels

Configuring the appropriate isolation level based on requirements

Excessive Isolation (Avoid)
// Using a high isolation level when unnecessary
async function getUserReport(userId) {
const connection = await pool.getConnection();
try {
// Using serializable (highest isolation) when not needed
await connection.query('SET TRANSACTION ISOLATION LEVEL SERIALIZABLE');
await connection.beginTransaction();

// This query just reads data and doesn't need high isolation
const userData = await connection.query(
'SELECT * FROM users WHERE id = ?',
[userId]
);

const userOrders = await connection.query(
'SELECT * FROM orders WHERE user_id = ?',
[userId]
);

await connection.commit();
return { user: userData[0], orders: userOrders };
} catch (error) {
await connection.rollback();
throw error;
} finally {
connection.release();
}
}
Appropriate Isolation (Recommended)
// Using appropriate isolation level for the task
async function getUserReport(userId) {
const connection = await pool.getConnection();
try {
// Using read committed - appropriate for read-only operations
await connection.query('SET TRANSACTION ISOLATION LEVEL READ COMMITTED');
await connection.beginTransaction();

const userData = await connection.query(
'SELECT * FROM users WHERE id = ?',
[userId]
);

const userOrders = await connection.query(
'SELECT * FROM orders WHERE user_id = ?',
[userId]
);

await connection.commit();
return { user: userData[0], orders: userOrders };
} catch (error) {
await connection.rollback();
throw error;
} finally {
connection.release();
}
}

// For operations requiring data consistency
async function transferFunds(fromAccount, toAccount, amount) {
const connection = await pool.getConnection();
try {
// Here, we need serializable to prevent concurrent transfers affecting balance
await connection.query('SET TRANSACTION ISOLATION LEVEL SERIALIZABLE');
await connection.beginTransaction();

// Check sufficient funds (needs to be consistent)
const [account] = await connection.query(
'SELECT balance FROM accounts WHERE id = ? FOR UPDATE',
[fromAccount]
);

if (account.balance < amount) {
throw new Error('Insufficient funds');
}

// Perform transfer
await connection.query(
'UPDATE accounts SET balance = balance - ? WHERE id = ?',
[amount, fromAccount]
);

await connection.query(
'UPDATE accounts SET balance = balance + ? WHERE id = ?',
[amount, toAccount]
);

await connection.commit();
} catch (error) {
await connection.rollback();
throw error;
} finally {
connection.release();
}
}

Concurrency Control

It's like managing multiple editors working on a document simultaneously - you need rules to prevent them from overwriting each other's changes while still allowing productive collaboration.

Optimistic vs. Pessimistic Concurrency

Concurrency Control Approaches

Comparing optimistic and pessimistic concurrency strategies

Pessimistic Locking (Avoid)
// Pessimistic concurrency control
async function updateUserProfile(userId, profileData) {
const connection = await pool.getConnection();
try {
await connection.beginTransaction();

// Lock the row exclusively - blocks other transactions
await connection.query(
'SELECT * FROM users WHERE id = ? FOR UPDATE',
[userId]
);

// Now update the data
await connection.query(
'UPDATE users SET name = ?, email = ?, bio = ? WHERE id = ?',
[profileData.name, profileData.email, profileData.bio, userId]
);

await connection.commit();
} catch (error) {
await connection.rollback();
throw error;
} finally {
connection.release();
}
}
Optimistic Locking (Recommended)
// Optimistic concurrency control
async function updateUserProfile(userId, profileData, version) {
const connection = await pool.getConnection();
try {
await connection.beginTransaction();

// No locks - instead use a version column for conflict detection
const result = await connection.query(
'UPDATE users SET name = ?, email = ?, bio = ?, version = version + 1 ' +
'WHERE id = ? AND version = ?',
[
profileData.name,
profileData.email,
profileData.bio,
userId,
version
]
);

// Check if update succeeded
if (result.affectedRows === 0) {
// No rows were updated - version mismatch
throw new Error('Concurrent update detected. Please refresh and try again.');
}

await connection.commit();
return version + 1; // Return new version
} catch (error) {
await connection.rollback();
throw error;
} finally {
connection.release();
}
}
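
Optimistic locking assumes the users table carries a version column (for example, an integer defaulting to 0). A typical caller reads the row together with its version, applies the edits, and passes the version back; on a conflict it reloads and lets the user retry. A sketch, assuming a direct pool.query helper and an array-of-rows return shape (both vary by driver):

// Sketch: how the version value flows through an optimistic update
async function editProfile(userId, changes) {
  // 1. Read the row along with its current version number
  const rows = await pool.query(
    'SELECT id, name, email, bio, version FROM users WHERE id = ?',
    [userId]
  );
  const user = rows[0];

  // 2. Merge the edits in memory
  const profileData = {
    name: user.name,
    email: user.email,
    bio: user.bio,
    ...changes
  };

  // 3. Attempt the conditional update using the version that was read.
  //    If another writer committed first, updateUserProfile throws and
  //    the caller can re-read the row and ask the user to try again.
  return updateUserProfile(userId, profileData, user.version);
}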

Error Handling and Recovery

Database Error Handling

Like having a plan for when things go wrong at a restaurant - from minor issues (a slight delay) to major problems (kitchen fire). Good systems plan for errors and know how to recover gracefully.

Classification of Errors

  1. Connection Errors: Failed to establish or maintain a connection
  2. Constraint Violations: Attempts to insert/update data that violates integrity constraints
  3. Deadlocks: Concurrent transactions that block each other
  4. Timeouts: Operations that take too long to complete
  5. Serialization Failures: Concurrent transactions that violate isolation guarantees
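
As a sketch of how the five categories above might be detected in application code: the codes shown are MySQL-style, as surfaced by the Node.js mysql/mysql2 drivers; other databases report different codes (PostgreSQL, for example, signals serialization failures with SQLSTATE 40001).

// Sketch: map driver error codes onto the categories above
// (MySQL-style codes assumed; adjust for your driver and database)
function classifyDatabaseError(error) {
  switch (error.code) {
    // Connection errors
    case 'ECONNREFUSED':
    case 'ECONNRESET':
    case 'PROTOCOL_CONNECTION_LOST':
      return 'connection';
    // Constraint violations
    case 'ER_DUP_ENTRY':
    case 'ER_NO_REFERENCED_ROW_2':
      return 'constraint';
    // Deadlocks (MySQL also reports some serialization conflicts this way)
    case 'ER_LOCK_DEADLOCK':
      return 'deadlock';
    // Timeouts
    case 'ETIMEDOUT':
    case 'ER_LOCK_WAIT_TIMEOUT':
      return 'timeout';
    default:
      return 'unknown'; // anything else: treat as non-retryable
  }
}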

Error Recovery Strategies

Error Handling and Recovery

Implementing robust error handling with retry logic

Basic Error Handling (Avoid)
// Basic error handling without retry
async function executeQuery(sql, params) {
const connection = await pool.getConnection();
try {
await connection.beginTransaction();
const result = await connection.query(sql, params);
await connection.commit();
return result;
} catch (error) {
await connection.rollback();
// Simply rethrow the error
throw error;
} finally {
connection.release();
}
}
Advanced Error Recovery (Recommended)
// Advanced error handling with categorization and retries
async function executeQuery(sql, params, options = {}) {
const {
maxRetries = 3,
initialDelay = 100,
exponentialFactor = 2
} = options;

let retryCount = 0;
let lastError = null;

while (retryCount <= maxRetries) {
const connection = await pool.getConnection();
try {
await connection.beginTransaction();
const result = await connection.query(sql, params);
await connection.commit();
return result;
} catch (error) {
await connection.rollback();
lastError = error;

// Categorize errors
const isRetryable = isRetryableError(error);

// Increment retry counter
retryCount++;

if (isRetryable && retryCount <= maxRetries) {
// Calculate exponential backoff delay
const delay = initialDelay * Math.pow(exponentialFactor, retryCount - 1);
console.log(`Retrying query after ${delay}ms (attempt ${retryCount})`);
await sleep(delay);
// Continue to next retry iteration
} else {
// Not retryable or max retries exceeded
throw enhanceError(error, {
sql,
params,
retryCount
});
}
} finally {
connection.release();
}
}

// This code shouldn't be reached, but just in case
throw lastError;
}

// Helper function to determine if error is retryable
function isRetryableError(error) {
// Deadlocks, connection issues and timeouts are typically retryable
return (
error.code === 'ER_LOCK_DEADLOCK' ||
error.code === 'ECONNRESET' ||
error.code === 'ER_QUERY_TIMEOUT'
);
}

// Helper to enhance error with context
function enhanceError(error, context) {
error.queryContext = context;
return error;
}

// Helper for async sleep
function sleep(ms) {
return new Promise(resolve => setTimeout(resolve, ms));
}
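
Usage might look like this, with more aggressive retry settings for a statement that occasionally deadlocks (the table and values are illustrative):

// Hypothetical usage of the retrying executeQuery helper above
const result = await executeQuery(
  'UPDATE inventory SET quantity = quantity - ? WHERE product_id = ?',
  [1, 42],
  { maxRetries: 5, initialDelay: 50, exponentialFactor: 2 }
);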

Best Practices for SQL Session Management

Connection String Security

Like keeping your house keys secure - you don't leave them out in the open, you don't make unnecessary copies, and you change the locks periodically for better security.

Securing Connection Strings

Properly managing database credentials

Insecure Practices (Avoid)
// Hard-coded connection string - NEVER do this
const connection = mysql.createConnection({
host: 'production-db.example.com',
user: 'admin',
password: 'super-secret-p@ssword!',
database: 'customer_data'
});

// Connection string directly in code
// app.js
const connectionString = 'Server=prod.example.com;Database=app_db;User ID=app_user;Password=Secr3t!';
const connection = new SqlConnection(connectionString);
Secure Practices (Recommended)
// Using environment variables
const connection = mysql.createConnection({
host: process.env.DB_HOST,
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME
});

// Using a dedicated config service
const config = require('./config');
const dbConfig = await config.getSecureConfiguration('database');
const connection = new SqlConnection(dbConfig.connectionString);

// Using a secrets manager service
const { SecretsManager } = require('aws-sdk');
const secretsManager = new SecretsManager();

async function getDatabaseConnection() {
const secretData = await secretsManager.getSecretValue({
SecretId: 'prod/app/database'
}).promise();

const dbConfig = JSON.parse(secretData.SecretString);
return mysql.createConnection(dbConfig);
}

Parameterized Queries

Always use parameterized queries to prevent SQL injection attacks:

Using Parameterized Queries

Protecting against SQL injection attacks

Unsafe Queries (Avoid)
// Vulnerable to SQL injection
function getUserByUsername(username) {
// DANGEROUS: Direct string concatenation
const query = `SELECT * FROM users WHERE username = '${username}'`;
return connection.query(query);
}

// Usage that could be exploited
getUserByUsername("admin' OR '1'='1"); // Returns all users!
Parameterized Queries (Recommended)
// Safe from SQL injection
function getUserByUsername(username) {
// Parameters are properly escaped
const query = 'SELECT * FROM users WHERE username = ?';
return connection.query(query, [username]);
}

// Using named parameters
function getUserByCredentials(username, password) {
const query = 'SELECT * FROM users WHERE username = :username AND password_hash = :passwordHash';
return connection.query(query, {
username: username,
passwordHash: hashPassword(password)
});
}
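
Note that named placeholders are driver-dependent; with mysql2, for instance, they have to be enabled explicitly (positional ? parameters work out of the box):

// mysql2: named placeholders are opt-in
const pool = mysql.createPool({
  host: process.env.DB_HOST,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_NAME,
  namedPlaceholders: true // enables :name style parameters
});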

Performance Monitoring

Database Performance Monitoring

Like monitoring traffic patterns on roads - you want to know where congestion happens, how long people wait, and whether certain intersections are causing problems, so you can improve the overall system.

Key metrics to monitor:

  1. Connection Pool Utilization: Track how many connections are active vs. idle
  2. Connection Wait Time: How long requests wait for an available connection
  3. Transaction Duration: Identify long-running transactions
  4. Query Performance: Track slow queries and optimization opportunities
  5. Connection Errors: Monitor frequency and patterns of connection failures

// Example instrumentation code for monitoring connection pool
function createInstrumentedPool(config) {
const pool = mysql.createPool(config);
const metrics = {
connectionsCreated: 0,
connectionsAcquired: 0,
connectionsReleased: 0,
connectionWaitTime: [], // Array of wait times
activeConnections: 0,
maxActiveConnections: 0,
errors: {
connection: 0,
query: 0,
timeout: 0
}
};

// Wrap getConnection to track metrics
const originalGetConnection = pool.getConnection.bind(pool);
pool.getConnection = async function() {
const startTime = Date.now();
metrics.connectionsAcquired++;

try {
const connection = await originalGetConnection();

// Track wait time
const waitTime = Date.now() - startTime;
metrics.connectionWaitTime.push(waitTime);

// Track active connections
metrics.activeConnections++;
metrics.maxActiveConnections = Math.max(
metrics.maxActiveConnections,
metrics.activeConnections
);

// Wrap release method to track returns
const originalRelease = connection.release.bind(connection);
connection.release = function() {
metrics.connectionsReleased++;
metrics.activeConnections--;
return originalRelease();
};

return connection;
} catch (error) {
metrics.errors.connection++;
throw error;
}
};

// Add method to get metrics
pool.getMetrics = function() {
return {
...metrics,
averageWaitTime: metrics.connectionWaitTime.length > 0
? metrics.connectionWaitTime.reduce((a, b) => a + b, 0) / metrics.connectionWaitTime.length
: 0,
connectionUtilization: config.connectionLimit > 0
? metrics.activeConnections / config.connectionLimit
: 0
};
};

return pool;
}

// Usage
const pool = createInstrumentedPool({
host: 'localhost',
user: 'app',
password: 'password',
database: 'myapp',
connectionLimit: 10
});

// Periodically report metrics
setInterval(() => {
const metrics = pool.getMetrics();
console.log('DB Pool Metrics:', metrics);

// Send to monitoring system
monitoringClient.gauge('db.connections.active', metrics.activeConnections);
monitoringClient.gauge('db.connections.utilization', metrics.connectionUtilization);
monitoringClient.histogram('db.connections.waitTime', metrics.averageWaitTime);
}, 60000);

Advanced Patterns

Repository Pattern

Like having a librarian who handles all the details of finding, organizing, and updating books. You simply ask for what you need, and the repository (librarian) handles all the complexities of storage and retrieval.

Repository Pattern Implementation

Creating a clean data access abstraction layer

Direct Database Access (Avoid)
// Direct database access throughout application
// UserService.js
class UserService {
async getUserById(userId) {
const connection = await pool.getConnection();
try {
const [rows] = await connection.query(
'SELECT * FROM users WHERE id = ?',
[userId]
);
return rows[0];
} finally {
connection.release();
}
}

async updateUserEmail(userId, email) {
const connection = await pool.getConnection();
try {
await connection.beginTransaction();
await connection.query(
'UPDATE users SET email = ? WHERE id = ?',
[email, userId]
);
await connection.commit();
} catch (error) {
await connection.rollback();
throw error;
} finally {
connection.release();
}
}
}

// OrderService.js - duplicated database logic
class OrderService {
async getOrdersForUser(userId) {
const connection = await pool.getConnection();
try {
const [rows] = await connection.query(
'SELECT * FROM orders WHERE user_id = ?',
[userId]
);
return rows;
} finally {
connection.release();
}
}
}
Repository Pattern (Recommended)
// Database access through repository pattern
// UserRepository.js
class UserRepository {
constructor(db) {
this.db = db;
}

async findById(id) {
return this.db.withConnection(async conn => {
const [rows] = await conn.query(
'SELECT * FROM users WHERE id = ?',
[id]
);
return rows[0] ? this.mapToEntity(rows[0]) : null;
});
}

async findByEmail(email) {
return this.db.withConnection(async conn => {
const [rows] = await conn.query(
'SELECT * FROM users WHERE email = ?',
[email]
);
return rows[0] ? this.mapToEntity(rows[0]) : null;
});
}

async update(user) {
return this.db.withTransaction(async conn => {
await conn.query(
'UPDATE users SET email = ?, name = ?, updated_at = ? WHERE id = ?',
[user.email, user.name, new Date(), user.id]
);
return user;
});
}

// Map database record to domain entity
mapToEntity(row) {
return {
id: row.id,
email: row.email,
name: row.name,
createdAt: row.created_at,
updatedAt: row.updated_at
};
}
}

// OrderRepository.js
class OrderRepository {
constructor(db) {
this.db = db;
}

async findByUserId(userId) {
return this.db.withConnection(async conn => {
const [rows] = await conn.query(
'SELECT * FROM orders WHERE user_id = ?',
[userId]
);
return rows.map(row => this.mapToEntity(row));
});
}

// Other methods...
}

// UserService.js - cleaner with repository
class UserService {
constructor(userRepository, orderRepository) {
this.userRepository = userRepository;
this.orderRepository = orderRepository;
}

async getUserWithOrders(userId) {
const user = await this.userRepository.findById(userId);
if (!user) {
throw new Error('User not found');
}

user.orders = await this.orderRepository.findByUserId(userId);
return user;
}

async updateUserEmail(userId, email) {
const user = await this.userRepository.findById(userId);
if (!user) {
throw new Error('User not found');
}

// Check if email is already in use
const existingUser = await this.userRepository.findByEmail(email);
if (existingUser && existingUser.id !== userId) {
throw new Error('Email already in use');
}

user.email = email;
return this.userRepository.update(user);
}
}

Database Abstraction Layer

Creating a robust database abstraction layer helps manage connections and transactions consistently:

// db.js - Database abstraction layer
class Database {
constructor(config) {
this.pool = mysql.createPool(config);
}

// Execute with a managed connection
async withConnection(callback) {
const connection = await this.pool.getConnection();
try {
return await callback(connection);
} finally {
connection.release();
}
}

// Execute in a transaction
async withTransaction(callback) {
return this.withConnection(async connection => {
try {
await connection.beginTransaction();
const result = await callback(connection);
await connection.commit();
return result;
} catch (error) {
await connection.rollback();
throw error;
}
});
}

// Simple query execution
async query(sql, params) {
return this.withConnection(conn => conn.query(sql, params));
}

// Transaction with isolation level
async withIsolation(level, callback) {
return this.withConnection(async connection => {
try {
await connection.query(`SET TRANSACTION ISOLATION LEVEL ${level}`);
await connection.beginTransaction();
const result = await callback(connection);
await connection.commit();
return result;
} catch (error) {
await connection.rollback();
throw error;
}
});
}

// Close the pool (for application shutdown)
async close() {
return this.pool.end();
}
}

// Usage
const db = new Database({
host: process.env.DB_HOST,
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME,
connectionLimit: 10
});

// Simple query
const users = await db.query('SELECT * FROM users WHERE active = ?', [true]);

// With transaction
await db.withTransaction(async conn => {
await conn.query('UPDATE accounts SET balance = balance - ? WHERE id = ?', [100, 1]);
await conn.query('UPDATE accounts SET balance = balance + ? WHERE id = ?', [100, 2]);
});

// With specific isolation level
await db.withIsolation('SERIALIZABLE', async conn => {
const [account] = await conn.query('SELECT balance FROM accounts WHERE id = ? FOR UPDATE', [1]);
if (account.balance < 100) {
throw new Error('Insufficient funds');
}
await conn.query('UPDATE accounts SET balance = balance - ? WHERE id = ?', [100, 1]);
await conn.query('INSERT INTO transfers (amount, account_id) VALUES (?, ?)', [100, 1]);
});

Common Pitfalls and Antipatterns

N+1 Query Problem

Like going to a library, getting a list of 100 book titles, and then making 100 separate trips to find each book. It would be much more efficient to get all the book details at once.

Solving the N+1 Query Problem

Optimizing related data retrieval

N+1 Query Problem (Avoid)
// N+1 query problem
async function getUsersWithOrders() {
// First query to get users
const users = await db.query('SELECT * FROM users');

// For each user, execute another query (N additional queries)
for (const user of users) {
// This creates a separate query for EACH user
user.orders = await db.query(
'SELECT * FROM orders WHERE user_id = ?',
[user.id]
);
}

return users;
}

// With 100 users, this executes 101 separate database queries!
Optimized Approaches (Recommended)
// Solution 1: Join query
async function getUsersWithOrders() {
const rows = await db.query(
'SELECT u.*, o.id as order_id, o.total, o.created_at as order_date ' +
'FROM users u ' +
'LEFT JOIN orders o ON u.id = o.user_id'
);

// Process the joined data
const usersMap = new Map();

for (const row of rows) {
if (!usersMap.has(row.id)) {
usersMap.set(row.id, {
id: row.id,
name: row.name,
email: row.email,
orders: []
});
}

if (row.order_id) {
const user = usersMap.get(row.id);
user.orders.push({
id: row.order_id,
total: row.total,
date: row.order_date
});
}
}

return Array.from(usersMap.values());
}

// Solution 2: Batch loading with IN clause
async function getUsersWithOrders() {
// First query to get users
const users = await db.query('SELECT * FROM users');

if (users.length === 0) {
return [];
}

// Get all user IDs
const userIds = users.map(user => user.id);

// Single query to get all orders for all users at once
const orders = await db.query(
'SELECT * FROM orders WHERE user_id IN (?)',
[userIds]
);

// Map orders to users
const ordersByUser = {};
for (const order of orders) {
if (!ordersByUser[order.user_id]) {
ordersByUser[order.user_id] = [];
}
ordersByUser[order.user_id].push(order);
}

// Assign orders to users
for (const user of users) {
user.orders = ordersByUser[user.id] || [];
}

return users;
}

Connection Leaks

Like borrowing books from a library and never returning them. Eventually, the library runs out of books to lend, and no one else can borrow anything.

Preventing Connection Leaks

Ensuring connections are always properly released

Connection Leaks (Avoid)
// Connection leak - release is skipped on the success path
async function getUser(userId) {
const connection = await pool.getConnection();

try {
const [rows] = await connection.query(
'SELECT * FROM users WHERE id = ?',
[userId]
);
// Early return: the release() call below is never reached!
return rows[0];
} catch (error) {
console.error('Error fetching user:', error);
// The error is swallowed, so the caller silently gets undefined
}

// Without a finally block, this line only runs on the error path,
// so every successful call leaks its connection!
connection.release();
}

// Another leak - async/await misuse
async function updateUser(userId, userData) {
const connection = await pool.getConnection();

// No try/catch/finally at all
await connection.beginTransaction();

await connection.query(
'UPDATE users SET name = ?, email = ? WHERE id = ?',
[userData.name, userData.email, userId]
);

await connection.commit();

// If any of the above awaits reject, this line is never reached
connection.release();
}
Proper Management (Recommended)
// Proper connection management
async function getUser(userId) {
const connection = await pool.getConnection();

try {
const [rows] = await connection.query(
'SELECT * FROM users WHERE id = ?',
[userId]
);
return rows[0];
} catch (error) {
console.error('Error fetching user:', error);
throw error; // Re-throw to handle at a higher level
} finally {
// Always release the connection, even if an error occurs
connection.release();
}
}

// Using a helper function to ensure cleanup
async function withConnection(callback) {
const connection = await pool.getConnection();

try {
return await callback(connection);
} finally {
connection.release();
}
}

// Usage of helper
async function updateUser(userId, userData) {
return withConnection(async (connection) => {
try {
await connection.beginTransaction();

await connection.query(
'UPDATE users SET name = ?, email = ? WHERE id = ?',
[userData.name, userData.email, userId]
);

await connection.commit();
} catch (error) {
await connection.rollback();
throw error;
}
});
}

Conclusion

Effective SQL session management is crucial for building reliable, high-performance applications. By implementing proper connection pooling, transaction management, error handling, and following best practices for security and performance, you can ensure your database layer provides a solid foundation for your application.

Remember these key principles:

  1. Resource Efficiency: Reuse connections through pooling, minimize transaction durations
  2. Consistency: Ensure data integrity with proper transaction isolation and error handling
  3. Security: Protect connection strings, use parameterized queries
  4. Monitoring: Track performance metrics to identify and resolve bottlenecks
  5. Abstraction: Use patterns like Repository and Unit of Work to separate concerns

By applying these SQL session management techniques, you'll create database interactions that are more maintainable, performant, and resilient.