Boost PostgreSQL Performance With ADO.NET Batch Operations

Batch operations are essential for high-performance database applications. This article explores efficient batch-processing techniques for PostgreSQL using dotConnect for PostgreSQL in C# and .NET. We'll examine the limitations of traditional loop-based approaches and demonstrate optimized methods that leverage dotConnect's advanced capabilities to maximize throughput and minimize server overhead.

The problem with loop-based batch operations

The most straightforward approach to batch inserts or updates is to execute DML statements one by one in a loop (a sketch of this pattern follows the list below). This approach creates significant performance bottlenecks:

  • Excessive disk I/O caused by repeated execution of identical statements
  • Increased server load as each statement requires individual preparation and execution
  • Transaction processing delays that compound with the dataset size
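
For reference, here is a minimal sketch of the loop-based pattern described above. It assumes the batch_test table created later in this article and reuses the placeholder connection string from the examples below; it is shown only as the baseline we compare against.

using Devart.Data.PostgreSql;

namespace BatchPgSql
{
    class LoopInsertExample
    {
        static void Main(string[] args)
        {
            string connectionString = "Host=127.0.0.1;User Id=TestUser;Password=TestPassword;Database=TestDb;Schema=public;License key=**********";

            using (PgSqlConnection connection = new PgSqlConnection(connectionString))
            {
                connection.Open();

                PgSqlCommand command = new PgSqlCommand(
                    "INSERT INTO batch_test (id, f_integer, f_varchar) VALUES (:id, :f_integer, :f_varchar)",
                    connection);
                command.Parameters.Add("id", PgSqlType.Int);
                command.Parameters.Add("f_integer", PgSqlType.Int);
                command.Parameters.Add("f_varchar", PgSqlType.VarChar, 100);

                // each iteration is a separate statement execution and network round trip
                for (int i = 1; i <= 1000; i++)
                {
                    command.Parameters["id"].Value = i;
                    command.Parameters["f_integer"].Value = i;
                    command.Parameters["f_varchar"].Value = "string " + i;
                    command.ExecuteNonQuery();
                }
            }
        }
    }
}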

A better approach

Optimized batch techniques handle multiple data changes in a single execution, either through consolidated DML commands or grouped statement execution. This article demonstrates how dotConnect for PostgreSQL enables these advanced patterns, with the optimal approach depending on your specific use case and PostgreSQL's native capabilities.

Batch insert and update strategies

There are various ways to execute batch operations. The conventional one in ADO.NET involves using a DataAdapter (see Perform batch operations using DataAdapters), but dotConnect for PostgreSQL also offers several alternative approaches, allowing you to select the one that best fits your needs. These alternatives also demonstrate better performance in specific scenarios.

In dotConnect for PostgreSQL, batch operations can be executed using:

  • PgSqlDataAdapter
  • PgSqlDataTable
  • PgSqlCommand with the ExecuteArray() method

Let's explore each method with practical examples and performance insights.

Batch operations in PostgreSQL

Create a PostgreSQL test table

Our examples will use the PostgreSQL test database, which contains the batch_test table.

You can create the test database with any SQL tool and then create its table with the following DDL statement:

CREATE TABLE batch_test (
    id        int PRIMARY KEY,
    f_integer int,
    f_varchar character varying(100)
);

Batch INSERT using PgSqlDataAdapter

Executing batch operations in ADO.NET using DataAdapter is a conventional and well-documented approach. According to the official guidelines, the process involves the following steps:

  • Prepare the data: Add multiple DataRow objects to the DataTable, or use a pre-filled DataTable from a DataSet.
  • Configure PgSqlDataAdapter: Initialize the InsertCommand, UpdateCommand, and DeleteCommand properties to define how DataTable changes translate into SQL operations.
  • Set the batch size: Specify the number of operations to group using the UpdateBatchSize property.
  • Execute the batch: Call the Update() method to apply all pending changes to the database.

Below is a code example illustrating how to perform a batch INSERT operation using PgSqlDataAdapter.

using System.Data;
using Devart.Data.PostgreSql;

namespace BatchPgSql
{
    class Program
    {
        static void Main(string[] args)
        {
            string connectionString = "Host=127.0.0.1;User Id=TestUser;Password=TestPassword;Database=TestDb;Schema=public;License key=**********";

            try
            {
                using (PgSqlConnection connection = new PgSqlConnection(connectionString))
                {
                    connection.Open();

                    // create DataTable
                    DataTable dataTable = new DataTable();
                    dataTable.Columns.Add("id", typeof(int));
                    dataTable.Columns.Add("f_integer", typeof(int));
                    dataTable.Columns.Add("f_varchar", typeof(string));

                    // fill DataTable with data
                    dataTable.Rows.Add(1, 1, "string 1");
                    dataTable.Rows.Add(2, 2, "string 2");
                    dataTable.Rows.Add(3, 3, "string 3");
                    dataTable.Rows.Add(4, 4, "string 4");
                    dataTable.Rows.Add(5, 5, "string 5");

                    // create PgSqlDataAdapter
                    PgSqlDataAdapter adapter = new PgSqlDataAdapter();

                    // set the INSERT command and parameters
                    adapter.InsertCommand = new PgSqlCommand("INSERT INTO batch_test" +
                    "(id, f_integer, f_varchar) " +
                    "VALUES" +
                    "(:id, :f_integer, :f_varchar)",
                    connection);
                    adapter.InsertCommand.Parameters.Add("id", PgSqlType.Int, 4, "id");
                    adapter.InsertCommand.Parameters.Add("f_integer", PgSqlType.Int, 4, "f_integer");
                    adapter.InsertCommand.Parameters.Add("f_varchar", PgSqlType.VarChar, 100, "f_varchar");
                    adapter.InsertCommand.UpdatedRowSource = UpdateRowSource.None;

                    // set the UPDATE command and parameters
                    adapter.UpdateCommand = new PgSqlCommand("UPDATE batch_test " +
                    "SET " +
                    "f_integer = :f_integer, f_varchar = :f_varchar " +
                    "WHERE " +
                    "id = :id",
                    connection);
                    adapter.UpdateCommand.Parameters.Add("id", PgSqlType.Int, 4, "id");
                    adapter.UpdateCommand.Parameters.Add("f_integer", PgSqlType.Int, 4, "f_integer");
                    adapter.UpdateCommand.Parameters.Add("f_varchar", PgSqlType.VarChar, 100, "f_varchar");
                    adapter.UpdateCommand.UpdatedRowSource = UpdateRowSource.None;

                    // set the DELETE command and parameter
                    adapter.DeleteCommand = new PgSqlCommand("DELETE FROM batch_test " +
                    "WHERE " +
                    "id = :id",
                    connection);
                    adapter.DeleteCommand.Parameters.Add("id", PgSqlType.Int, 4, "id");
                    adapter.DeleteCommand.UpdatedRowSource = UpdateRowSource.None;

                    // set the batch size
                    adapter.UpdateBatchSize = 5;

                    // execute the update
                    adapter.Update(dataTable);
                }

            }
            catch (Exception ex)
            {
                Console.WriteLine("An error occurred: " + ex.Message);
            }
        }
    }
}

Batch INSERT using PgSqlDataTable

As previously mentioned, executing batch operations with PgSqlDataAdapter involves several time-consuming steps: you have to create the DataAdapter, initialize its commands, define their parameters, and so on.

dotConnect for PostgreSQL provides a straightforward alternative using the PgSqlDataTable class, which streamlines the process to these simple steps:

  • Load initial data: Assign a SelectCommand to PgSqlDataTable and call Open() to populate it.
  • Modify data: Perform insert, update, or delete operations directly on PgSqlDataTable.
  • Set the batch size: Define how many operations to group together using the UpdateBatchSize property.
  • Execute the batch: Call Update() to apply all pending changes to the database.

The following code example demonstrates this approach.

using Devart.Data.PostgreSql;

namespace BatchPgSql
{
    class Program
    {
        static void Main(string[] args)
        {
            string connectionString = "Host=127.0.0.1;User Id=TestUser;Password=TestPassword;Database=TestDb;Schema=public;License key=**********";

            try
            {
                using (PgSqlConnection connection = new PgSqlConnection(connectionString))
                {
                    connection.Open();

                    // create PgSqlDataTable
                    PgSqlDataTable dataTable = new PgSqlDataTable();
                    dataTable.SelectCommand = new PgSqlCommand("SELECT * FROM batch_test", connection);
                    dataTable.Open();

                    // add rows to PgSqlDataTable
                    dataTable.Rows.Add(1, 1, "string 1");
                    dataTable.Rows.Add(2, 2, "string 2");
                    dataTable.Rows.Add(3, 3, "string 3");
                    dataTable.Rows.Add(4, 4, "string 4");
                    dataTable.Rows.Add(5, 5, "string 5");

                    // set the batch size
                    dataTable.UpdateBatchSize = 5;

                    // execute the update
                    dataTable.Update();
                }

            }
            catch (Exception ex)
            {
                Console.WriteLine("An error occurred: " + ex.Message);
            }
        }
    }
}

The example shows that this method does not require you to manually create and initialize a DataAdapter; the necessary INSERT, UPDATE, and DELETE statements are generated automatically. However, if needed, you can explicitly initialize the InsertCommand, UpdateCommand, and DeleteCommand properties, just as you would with a DataAdapter.
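
For example, an explicit INSERT command could be assigned before calling Update(). The snippet below is a brief sketch that continues the PgSqlDataTable example above; in most cases the automatically generated command is sufficient.

// optionally override the auto-generated INSERT statement
PgSqlCommand insertCommand = new PgSqlCommand(
    "INSERT INTO batch_test (id, f_integer, f_varchar) VALUES (:id, :f_integer, :f_varchar)",
    connection);
insertCommand.Parameters.Add("id", PgSqlType.Int, 4, "id");
insertCommand.Parameters.Add("f_integer", PgSqlType.Int, 4, "f_integer");
insertCommand.Parameters.Add("f_varchar", PgSqlType.VarChar, 100, "f_varchar");
dataTable.InsertCommand = insertCommand;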

Batch INSERT using PgSqlCommand

If you need to execute only a single type of batch operation, such as INSERT, UPDATE, or DELETE, we recommend the alternative method provided by dotConnect for PostgreSQL: batch operations with PgSqlCommand. Spoiler: this method stands out among the rest in terms of performance.

Support for batch operations using PgSqlCommand was introduced in dotConnect for PostgreSQL version 2025.1.

To execute batch operations with PgSqlCommand, follow these steps:

  • Set the DML statement: Define the INSERT, UPDATE, or DELETE statement in the CommandText property.
  • Bind arrays to parameters: Assign an array of values to each parameter, where each array element represents a value for a different execution.
  • Execute the batch: Call the ExecuteArray() method to execute the command for all parameter values in a single batch.

The example below illustrates the execution of batch operations using PgSqlCommand.

using Devart.Data.PostgreSql;

namespace BatchPgSql
{
    class Program
    {
        static void Main(string[] args)
        {
            string connectionString = "Host=127.0.0.1;User Id=TestUser;Password=TestPassword;Database=TestDb;Schema=public;License key=**********";

            try
            {
                using (PgSqlConnection connection = new PgSqlConnection(connectionString))
                {
                    connection.Open();

                    // create PgSqlCommand
                    PgSqlCommand command = new PgSqlCommand("INSERT INTO batch_test" +
                    "(id, f_integer, f_varchar) " +
                    "VALUES" +
                    "(:id, :f_integer, :f_varchar)", connection);
                    command.Parameters.Add("id", PgSqlType.Int, 4, "id");
                    command.Parameters.Add("f_integer", PgSqlType.Int, 4, "f_integer");
                    command.Parameters.Add("f_varchar", PgSqlType.VarChar, 100, "f_varchar");

                    // fill PgSqlCommand parameter values
                    command.Parameters["id"].Value = new int[5] { 1, 2, 3, 4, 5 };
                    command.Parameters["f_integer"].Value = new int[5] { 1, 2, 3, 4, 5 };
                    command.Parameters["f_varchar"].Value = new string[5] { "string 1", "string 2", "string 3", "string 4", "string 5" };
                    
                    // execute the update
                    command.ExecuteArray();
                }

            }
            catch (Exception ex)
            {
                Console.WriteLine("An error occurred: " + ex.Message);
            }
        }
    }
}

It is worth mentioning that, unlike DataAdapter and PgSqlDataTable, batch operations using PgSqlCommand do not allow explicitly setting the batch size, as there is no property equivalent to UpdateBatchSize. Instead, the optimal batch size is automatically calculated.

Performance comparison

Finally, we present a comparative chart showing the time required to execute batch operations using each of the methods described in this article. The chart highlights the performance differences, helping you choose the most efficient method for your needs. Note, however, that these results are not absolute and may vary depending on the test environment and the specific use case.

For the benchmark, we measured the time spent executing INSERT, UPDATE, and DELETE commands over 1,000 records, with a batch size of 1,000.
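
The exact figures depend on the environment, but the measurement itself is simple to reproduce. Below is a minimal sketch of how one such run could be timed, here using PgSqlCommand.ExecuteArray() for the INSERT case; the table, connection string, and row count match the examples above, and the timing code is illustrative rather than the exact benchmark harness we used.

using System.Diagnostics;
using Devart.Data.PostgreSql;

namespace BatchPgSql
{
    class TimingExample
    {
        static void Main(string[] args)
        {
            string connectionString = "Host=127.0.0.1;User Id=TestUser;Password=TestPassword;Database=TestDb;Schema=public;License key=**********";

            // prepare 1000 rows of sample data as parameter arrays
            int[] ids = new int[1000];
            int[] integers = new int[1000];
            string[] strings = new string[1000];
            for (int i = 0; i < 1000; i++)
            {
                ids[i] = i + 1;
                integers[i] = i + 1;
                strings[i] = "string " + (i + 1);
            }

            using (PgSqlConnection connection = new PgSqlConnection(connectionString))
            {
                connection.Open();

                PgSqlCommand command = new PgSqlCommand(
                    "INSERT INTO batch_test (id, f_integer, f_varchar) VALUES (:id, :f_integer, :f_varchar)",
                    connection);
                command.Parameters.Add("id", PgSqlType.Int);
                command.Parameters.Add("f_integer", PgSqlType.Int);
                command.Parameters.Add("f_varchar", PgSqlType.VarChar, 100);
                command.Parameters["id"].Value = ids;
                command.Parameters["f_integer"].Value = integers;
                command.Parameters["f_varchar"].Value = strings;

                // time only the batch execution itself
                Stopwatch stopwatch = Stopwatch.StartNew();
                command.ExecuteArray();
                stopwatch.Stop();

                Console.WriteLine($"ExecuteArray, 1000 rows: {stopwatch.ElapsedMilliseconds} ms");
            }
        }
    }
}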

Refer to the chart below for performance results of batch operations executed using the methods outlined in this article.

Comparison of PostgreSQL batch operation methods performance

Let us now review the results represented in the chart.

  • The first chart bar, referred to as the Simple method, presents the results of executing individual DML operations in a loop. We included this approach intentionally, as a baseline that demonstrates the poor performance of loop-based execution.
  • The next bar, labeled the In transaction method, shows the results of executing the same loop of individual DML operations, but within the context of a single transaction. This significantly improves performance (a sketch of this pattern follows the list).
  • The DataAdapter.Update() bar illustrates the performance of using the ADO.NET DataAdapter to batch-update records. This method is more efficient than executing individual operations, as it handles multiple DML operations in a single process.
  • The next bar, labeled PgSqlDataTable.Update(), represents the performance of using the PgSqlDataTable object to update records in bulk. Although it performs better than simple individual operations, its main drawback is higher memory usage, which can hurt performance when working with large datasets.
  • PgSqlCommand.ExecuteArray() proved to be the fastest method for processing batch operations of a single type, such as INSERT, UPDATE, or DELETE statements. However, it does not support executing mixed operation types within the same batch.
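
For illustration, the In transaction variant mentioned above is sketched below; it assumes the same connection string and batch_test table as the earlier examples and simply wraps the naive loop in a single transaction.

using (PgSqlConnection connection = new PgSqlConnection(connectionString))
{
    connection.Open();

    // a single transaction: PostgreSQL commits once instead of once per statement
    using (PgSqlTransaction transaction = connection.BeginTransaction())
    {
        PgSqlCommand command = new PgSqlCommand(
            "INSERT INTO batch_test (id, f_integer, f_varchar) VALUES (:id, :f_integer, :f_varchar)",
            connection);
        command.Transaction = transaction;
        command.Parameters.Add("id", PgSqlType.Int);
        command.Parameters.Add("f_integer", PgSqlType.Int);
        command.Parameters.Add("f_varchar", PgSqlType.VarChar, 100);

        for (int i = 1; i <= 1000; i++)
        {
            command.Parameters["id"].Value = i;
            command.Parameters["f_integer"].Value = i;
            command.Parameters["f_varchar"].Value = "string " + i;
            command.ExecuteNonQuery();
        }

        transaction.Commit();
    }
}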

To sum up, the results presented in the chart confirm our insights. When you need to execute a single type of batch operation, such as INSERT, UPDATE, or DELETE, PgSqlCommand.ExecuteArray() is the most time-efficient method. However, in scenarios where different types of operations must be executed together, DataAdapter.Update() becomes the more practical choice despite its lower performance in single-type batch processing.

Not yet acquainted with dotConnect for PostgreSQL?

Not only does our state-of-the-art dotConnect for PostgreSQL offer optimized components for bulk operations, but it also supports a wide range of database-specific features, including multiple connection modes and protocols, advanced data type handling, and robust script management.

We invite you to try dotConnect for PostgreSQL and explore its powerful capabilities with our 30-day trial. Don't miss out on experiencing the full potential of PostgreSQL applications!
