EF Core Bulk Insert and Update: Best Practices and Examples
If you have ever handled large-scale database operations, you will agree that processing data row by row with EF Core's default approach can quickly become a performance bottleneck. Here is why.
Slow operations, excessive memory usage, and high database load are all symptoms of inefficient bulk operations. In production systems, these issues translate into lost time and higher operational costs.
That's where bulk insert and update strategies come in. These strategies process large volumes of data in batches, which significantly reduces the number of round trips to the database and cuts the time needed to work through large datasets. Your application also stays responsive under heavy load.
In this guide, you will learn about the best practices for EF Core bulk insert and update, walk through real-world examples, and see exactly how to implement these operations with dotConnect to achieve maximum efficiency.
What is dotConnect?
Designed by Devart, dotConnect is a high-performance ADO.NET provider that extends EF Core's capabilities with advanced database access features. It works seamlessly with PostgreSQL, Oracle, MySQL, DB2, and SQLite, offering both EF Core integration and direct database access for maximum flexibility. Devart also offers dotConnect providers for CRM, ERP, and accounting services such as Dynamics 365, Salesforce, and Zoho CRM.
Prerequisites and setup
Here are the tools and requirements you need to effectively implement the best practices explained in this guide:
- VS Code: The code editor to write and manage your EF Core bulk insert/update code.
- .NET 9 project: The actual runtime and framework your project will use.
- PostgreSQL database: A live database where you'll perform bulk operations.
- dotConnect for PostgreSQL (NuGet - Devart.Data.PostgreSQL.EFCore): An ADO.NET and EF Core provider from Devart that enables high-performance database connectivity and advanced features for bulk operations in EF Core.
Once you have everything in place, let's proceed.
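All of the snippets that follow assume a small product catalog model and a DbContext roughly like the one below. This is a minimal sketch: the connection string is a placeholder, and the entities simply carry the properties (Name, Price, Stock, LastUpdated, and a Category relationship) that the later examples touch.
using Microsoft.EntityFrameworkCore;
// The UsePostgreSql() extension method comes from the dotConnect for PostgreSQL
// EF Core provider package (Devart.Data.PostgreSQL.EFCore).

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; } = string.Empty;
    public decimal Price { get; set; }
    public int Stock { get; set; }
    public DateTime LastUpdated { get; set; }
    public int? CategoryId { get; set; }
    public Category? Category { get; set; }
}

public class Category
{
    public int Id { get; set; }
    public string Name { get; set; } = string.Empty;
    public List<Product> Products { get; set; } = new();
}

public class AppDbContext : DbContext
{
    public DbSet<Product> Products => Set<Product>();
    public DbSet<Category> Categories => Set<Category>();

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        // Placeholder connection string; adjust it to your environment.
        optionsBuilder.UsePostgreSql(
            "Host=localhost;Database=shop;Username=postgres;Password=yourpassword");
    }
}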
Bulk insert methods
When performing bulk inserts with EF Core and dotConnect, the method you choose can have a significant impact on performance. Here are the most common bulk insert methods, along with details on when and why to use each of them.
Basic insert with Add() and SaveChangesAsync()
This method is ideal for inserting a single record or a small amount of data. However, avoid it for large volumes of data, because saving records one at a time executes each insert as a separate command. See the code below for how to use this method:
using var context = new AppDbContext();
var product = new Product
{
Name = "Camera",
Price = 299.99m,
Stock = 50
};
context.Products.Add(product);
await context.SaveChangesAsync();
Batch insert with AddRange() and MaxBatchSize
If you are performing medium-sized inserts (with hundreds of records), a batch insert with AddRange() and MaxBatchSize is the ideal method to use. This approach improves performance by grouping insert statements into a single database round trip. Here is an example of how it works:
using var context = new AppDbContext();
var products = Enumerable.Range(1, 100).Select(i => new Product
{
Name = $"Product {i}",
Price = 10 + i,
Stock = 100 + i
}).ToList();
context.Products.AddRange(products);
await context.SaveChangesAsync();
To further enhance performance, you can configure the batch size globally. Capping the batch size keeps memory usage and transaction size under control. Below is an example of how to set batching globally in OnConfiguring():
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
    optionsBuilder.UsePostgreSql(
        "Host=localhost;Database=shop;Username=postgres;Password=yourpassword",
        b => b.MaxBatchSize(50));
}
Disable change tracking for performance (optional)
When you are inserting a large number of rows, EF Core's change tracking can slow the process down. One of the most effective ways to avoid this bottleneck is to disable automatic change detection temporarily. The code below shows how to do this:
using var context = new AppDbContext();
context.ChangeTracker.AutoDetectChangesEnabled = false;
var bulkProducts = Enumerable.Range(1, 1000).Select(i => new Product
{
Name = $"Bulk Product {i}",
Price = 5 + i,
Stock = 50 + i
}).ToList();
await context.Products.AddRangeAsync(bulkProducts);
await context.SaveChangesAsync();
Insert with navigation properties (cascade insert)
This method enables EF Core to insert related entities automatically. It is ideal when you want EF Core to handle relational inserts. For an example of how to use this method, see the code sample below:
var category = new Category
{
Name = "Electronics",
Products = new List<Product>
{
new Product { Name = "TV", Price = 1200 },
new Product { Name = "Laptop", Price = 1500 }
}
};
context.Categories.Add(category);
await context.SaveChangesAsync();
Insert with transaction scope
If you are working with multi-table or dependent insert operations where data integrity is critical, this approach is the best solution. An insert with a transaction scope wraps multiple inserts in a single atomic transaction, so they either all succeed or all fail. See the code below for how to use this method:
using var context = new AppDbContext();
using var transaction = await context.Database.BeginTransactionAsync();
context.Products.Add(new Product { Name = "One", Price = 10 });
context.Products.Add(new Product { Name = "Two", Price = 20 });
await context.SaveChangesAsync();
await transaction.CommitAsync();
Insert and retrieve identity key (RETURNING)
EF Core (and dotConnect) automatically retrieves generated primary keys after inserts. As a result, querying them manually is not necessary. Here is a code sample showing how this process works:
var product = new Product { Name = "Mic", Price = 45.5m };
context.Products.Add(product);
await context.SaveChangesAsync();
Console.WriteLine($"New product ID: {product.Id}");
Recommendations:
- For small inserts, use Add().
- For moderate bulk inserts, use AddRange() with batching.
- For very large datasets, disable change tracking and tune the batch size (see the sketch after this list).
- dotConnect's optimizations ensure these operations are as fast as possible while keeping EF Core's familiar workflow.
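As an illustration of the last two recommendations, the sketch below disables automatic change detection and combines it with a standard EF Core pattern not shown above: saving in chunks and clearing the change tracker between chunks so memory stays flat. The record count and chunk size are arbitrary values to tune for your own workload.
using var context = new AppDbContext();
context.ChangeTracker.AutoDetectChangesEnabled = false;

// Hypothetical source of a very large import.
var allProducts = Enumerable.Range(1, 100_000).Select(i => new Product
{
    Name = $"Imported Product {i}",
    Price = 1 + i % 100,
    Stock = i % 500
});

const int chunkSize = 1_000; // arbitrary; balance round trips against memory
foreach (var chunk in allProducts.Chunk(chunkSize))
{
    context.Products.AddRange(chunk);
    await context.SaveChangesAsync();
    context.ChangeTracker.Clear(); // drop tracked entities before the next chunk
}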
Summary of insertion methods
| Method | Purpose |
|---|---|
| Add() / AddRange() | Performs a standard insert |
| MaxBatchSize() | Enables insert batching |
| AutoDetectChangesEnabled = false | Boosts performance for large inserts |
| Navigation properties | Automatically inserts related entities |
| Transactions | Wraps inserts in a consistent unit of work |
| Identity key retrieval | Makes the generated ID available immediately after insert |
Bulk update methods
Performance plays an equally important role in bulk updates, and there are several ways to preserve it. When choosing a method, consider the size of your dataset and whether you need to load entities into memory. Below are common bulk update methods, with details on when to use each of them and why.
Basic update with EF Core and dotConnect
This update approach is the easiest to use. It loads entities, modifies them, and calls SaveChangesAsync(). It is ideal for small datasets where business logic requires full entity loading, but can be inefficient with thousands of rows due to the following issues:
- All matching rows are loaded into memory.
- One UPDATE statement is sent per entity.
Still, if you are working with a small dataset, this method comes in handy. Here is an example of this approach:
using var context = new AppDbContext();
var productsToUpdate = await context.Products
    .Where(p => p.Stock < 100)
    .ToListAsync();
foreach (var product in productsToUpdate)
{
    product.Stock += 50;
}
await context.SaveChangesAsync();
Server-side bulk update with ExecuteUpdateAsync() (EF Core 7+)
This update is performed directly in the database without loading entities into memory. It issues a single SQL UPDATE statement and is very fast. Here is an example of how it works:
using var context = new AppDbContext();
await context.Products
    .Where(p => p.Stock < 100)
    .ExecuteUpdateAsync(updates => updates
        .SetProperty(p => p.Stock, p => p.Stock + 50));
Behind the scenes, this query generates the following SQL:
UPDATE "Products" SET "Stock" = "Stock" + 50 WHERE "Stock" < 100;
Key benefits of this method:
- It uses a single SQL command, so network overhead is minimal.
- It is extremely fast for large datasets.
- This method is fully supported by dotConnect for PostgreSQL (EF Core 7+).
Batch update with MaxBatchSize (optional)
Although the entity-based updates described above work well for small datasets, you can still improve their performance by configuring batching. Here is how to do this:
optionsBuilder.UsePostgreSql(connStr, b => b.MaxBatchSize(100));
Using this method allows EF Core to send up to 100 UPDATE commands in one round trip (instead of sending them one by one).
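For example, with MaxBatchSize(100) configured as above, the entity-based update below still loads the rows, but the resulting UPDATE commands are grouped into batches on save. This is a sketch that reuses the model from the setup section; the stock threshold is arbitrary.
using var context = new AppDbContext(); // assumes MaxBatchSize(100) is configured
var lowStock = await context.Products
    .Where(p => p.Stock < 20)
    .ToListAsync();

foreach (var product in lowStock)
{
    product.Stock += 100; // each modified entity produces one UPDATE
}

await context.SaveChangesAsync(); // UPDATEs are sent in batches of up to 100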
Conditional field updates
This method updates only specific fields based on a condition. It is especially useful when you need to change a few columns across many rows without touching the rest. Here is how it works:
await context.Products
.Where(p => p.Name.Contains("Old"))
.ExecuteUpdateAsync(updates => updates
.SetProperty(p => p.Name, p => p.Name.Replace("Old", "New"))
.SetProperty(p => p.LastUpdated, _ => DateTime.UtcNow));
The above code is translated to a single SQL UPDATE statement, leveraging PostgreSQL string functions.
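For reference, the generated statement looks roughly like the following; treat it as an approximation, since the exact translation of Contains() and DateTime.UtcNow depends on the provider and its version:
UPDATE "Products"
SET "Name" = replace("Name", 'Old', 'New'),
    "LastUpdated" = now()
WHERE "Name" LIKE '%Old%';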
Bulk update inside a transaction
This method wraps updates in a transaction to ensure atomicity. It is best used in critical operations, where partial updates could cause data inconsistencies. Here is an example of how to use it:
using var context = new AppDbContext();
using var transaction = await context.Database.BeginTransactionAsync();
await context.Products
    .Where(p => p.Stock > 500)
    .ExecuteUpdateAsync(updates =>
        updates.SetProperty(p => p.Stock, p => 500));
await transaction.CommitAsync();
Updating related entities
When working with navigation properties, EF Core can cascade updates through loaded relationships. See the code below for how this approach works:
var categories = await context.Categories
    .Include(c => c.Products)
    .Where(c => c.Name == "Electronics")
    .ToListAsync();
foreach (var category in categories)
{
    foreach (var product in category.Products)
    {
        product.Price *= 0.9m; // apply discount
    }
}
await context.SaveChangesAsync();
Summary of updating methods
| Method | SQL round trips | Memory usage | Speed | Notes |
|---|---|---|---|---|
| ToListAsync() + loop + SaveChanges | Many | High | Slow | One UPDATE per entity unless batched |
| ExecuteUpdateAsync() | One | Low | Fast | Ideal for most bulk updates |
| MaxBatchSize() | Fewer | Medium | Medium | Helps if ExecuteUpdateAsync is not used |
| Transaction wrapper | One or many | Low-high | Safe | Ensures ACID consistency |
| Navigation property update | Many | High | Slow | Use only when relationship context needed |
Recommendations:
- For maximum performance, use ExecuteUpdateAsync() whenever possible. It is the most efficient method.
- For smaller sets or complex logic, use classic ToList() and per-entity updates.
- Enable MaxBatchSize to reduce round trips in older EF Core versions or entity-based loops.
- dotConnect for PostgreSQL fully supports all of the above techniques as long as you are using EF Core 7+ for ExecuteUpdateAsync.
dotConnect advantages with EF Core
With dotConnect, you get the ease and productivity of EF Core, without its common bottlenecks when handling large datasets. Here are some of the advantages that make dotConnect a go-to solution.
Performance enhancements
dotConnect is built for speed, especially in bulk operations. It enhances performance in the following ways:
- Optimized SQL execution: Reduces the number of database round trips by batching commands.
- Direct mode: Connects to Oracle, MySQL, and PostgreSQL without going through client libraries, thus lowering latency.
- Efficient memory usage: Processes large datasets without loading unnecessary tracking data.
- Server-side operations: Leverages EF Core 7+ features like ExecuteUpdateAsync() with full support.
Advanced features
dotConnect also provides the following advanced features to help improve your application's efficiency:
- MaxBatchSize control: Fine-tune how many commands are sent in each batch.
- Advanced type mapping: Handle complex entity properties with ease.
- Better integration with PostgreSQL features: Includes arrays, JSONB, and RETURNING clauses (handled automatically).
- Full async support: Benefit from non-blocking performance even with high data volumes.
Comparison with EF Core native capabilities
| Feature | EF Core native | EF Core + dotConnect |
|---|---|---|
| Bulk insert | Possible, but limited performance on large datasets | Optimized with batching, direct mode, and memory management |
| Bulk update | Supported in EF Core 7+ | Fully supported with performance boosts |
| MaxBatchSize | Limited to provider defaults | Fully configurable |
| PostgreSQL-specific features | Partial support | Extended support for arrays, JSONB, RETURNING |
| Connection overhead | Uses standard ADO.NET providers | Can bypass client libraries for faster access |
Code sample: bulk insert with dotConnect
using var context = new AppDbContext();
var products = Enumerable.Range(1, 10000).Select(i => new Product
{
Name = $"Product {i}",
Price = 10 + i,
Stock = 100 + i
}).ToList();
context.ChangeTracker.AutoDetectChangesEnabled = false;
await context.Products.AddRangeAsync(products);
await context.SaveChangesAsync();
With dotConnect, this insert runs significantly faster: the provider batches the commands efficiently, and disabling automatic change detection keeps EF Core's tracking overhead to a minimum.
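If you want to verify the difference in your own environment, a simple Stopwatch around the save is enough. The sketch below wraps the SaveChangesAsync() call from the snippet above; absolute numbers will depend entirely on your hardware and data.
using System.Diagnostics;

var stopwatch = Stopwatch.StartNew();
await context.SaveChangesAsync(); // the call from the snippet above
stopwatch.Stop();

Console.WriteLine($"Inserted {products.Count} rows in {stopwatch.ElapsedMilliseconds} ms");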
When should you use dotConnect?
Use dotConnect in any of the following situations:
- You need faster bulk inserts or updates without rewriting your EF Core code.
- You work with PostgreSQL-specific features like arrays or JSONB.
- You require direct, low-latency connections to your database.
- You want fine-grained control over batching, transactions, and execution strategies.
Conclusion
EF Core provides a fantastic developer experience, but its default row-by-row approach can quickly become a bottleneck on large datasets. dotConnect removes these barriers and offers a more efficient way of handling large-scale operations, combining EF Core's familiar API with the raw speed and flexibility of a tuned ADO.NET provider to accelerate your application.
If you want to work with EF Core without the wait times, especially in data-heavy applications, dotConnect is the right tool for the job.