Introduction
In this article, I bring together the most powerful advanced C# database techniques into a cohesive, multi-layered .NET application. By combining Entity Framework Core (EF Core), Dapper, and ADO.NET, I aim to create a highly performant, maintainable, and scalable data access architecture.
This solution leverages the strengths of each technology and incorporates:
- Compiled queries for optimized EF Core performance
- Table-valued parameters (TVPs) with ADO.NET for efficient bulk operations
- Dapper for ultra-fast, lightweight data access
- Interceptors and logging for observability and debugging
- Benchmarks to quantify the performance improvements
My goal is to produce a GitHub-ready reference project that developers can learn from and adapt to their own enterprise applications, complete with clean architecture, code examples, and detailed explanations.
Why Combine EF Core, Dapper, and ADO.NET?
Each data access approach has its strengths:
- EF Core provides rich change tracking, migrations, and LINQ support.
- Dapper offers raw speed and simplicity for executing SQL directly.
- ADO.NET allows low-level database operations, including TVPs, transactions, and multi-result sets.
By using each where it fits best, I can achieve an optimal balance of developer productivity, performance, and fine-grained control.
Architecture Overview
I structure the project as follows:
- Presentation Layer: ASP.NET Core Web API exposes REST endpoints.
- Application Layer: Business services implement use cases and orchestrate data operations.
- Data Access Layer:
  - EF Core for entity management and complex relational queries.
  - Dapper for fast, read-heavy operations.
  - ADO.NET for bulk operations, TVPs, and database-specific optimizations.
- Infrastructure Layer: Logging, interceptors, diagnostics, configuration, and performance measurement.
This modular design ensures separation of concerns, easy testing, and maintainability.
Key Components in Detail
1️⃣ Compiled Queries with EF Core
Compiled queries reduce runtime parsing overhead and significantly improve performance for frequently executed queries.
```csharp
static readonly Func<AppDbContext, int, Task<Student?>> GetStudentById =
    EF.CompileAsyncQuery((AppDbContext ctx, int id) =>
        ctx.Students.FirstOrDefault(s => s.Id == id));
```
I use this pattern for critical reads like lookups by ID or frequent filter queries, especially in reporting APIs.
2️⃣ ADO.NET with Table-Valued Parameters
Table-valued parameters allow passing entire datasets as structured parameters to stored procedures.
```csharp
var table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Rows.Add(1);
table.Rows.Add(2);

using var cmd = new SqlCommand("GetStudentsByIds", connection);
cmd.CommandType = CommandType.StoredProcedure;

// A TVP must be marked as Structured and reference the server-side type;
// AddWithValue alone cannot infer this.
var param = cmd.Parameters.AddWithValue("@Ids", table);
param.SqlDbType = SqlDbType.Structured;
param.TypeName = "dbo.IdList";
```
On the SQL Server side, I define a user-defined table type (`dbo.IdList`) and use it in stored procedures for highly efficient batch operations.
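The one-time schema setup can be sketched like this. The column list on `Students` (`Id`, `FirstName`, `LastName`, `IsActive`) is an assumption based on the entities used throughout this article; `CREATE TYPE` and `CREATE PROCEDURE` are executed as separate batches because they cannot share one.

```csharp
using System.Data.SqlClient;

// One-time setup for the TVP example above. Each DDL statement runs in
// its own batch.
var batches = new[]
{
    "CREATE TYPE dbo.IdList AS TABLE (Id INT NOT NULL PRIMARY KEY);",

    @"CREATE PROCEDURE dbo.GetStudentsByIds @Ids dbo.IdList READONLY AS
      BEGIN
          SELECT s.Id, s.FirstName, s.LastName, s.IsActive
          FROM dbo.Students s
          JOIN @Ids i ON i.Id = s.Id;
      END"
};

using var connection = new SqlConnection(connectionString);
connection.Open();
foreach (var batch in batches)
{
    using var cmd = new SqlCommand(batch, connection);
    cmd.ExecuteNonQuery();
}
```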
3️⃣ Dapper Integration
For lightweight queries, Dapper shines.
```csharp
var students = await connection.QueryAsync<Student>(
    "SELECT * FROM Students WHERE IsActive = 1");
```
I typically use Dapper for flat queries, especially in dashboards or read-optimized APIs where change tracking isn't needed.
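One Dapper convenience worth showing: it expands an enumerable parameter into an `IN (...)` list, which is handy for small ad-hoc batches (TVPs remain the better fit for large ones). A minimal sketch against the same `Students` table:

```csharp
// Dapper rewrites "IN @Ids" into "IN (@Ids1, @Ids2, @Ids3)" and binds
// one parameter per element.
var ids = new[] { 1, 2, 3 };
var students = await connection.QueryAsync<Student>(
    "SELECT Id, FirstName, LastName, IsActive FROM Students WHERE Id IN @Ids",
    new { Ids = ids });
```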
4️⃣ Logging Interceptors
To improve observability, I add a custom `DbCommandInterceptor`.
```csharp
public class QueryLogger : DbCommandInterceptor
{
    // ReaderExecuting returns an InterceptionResult so the pipeline can
    // continue (or be suppressed); it is not void.
    public override InterceptionResult<DbDataReader> ReaderExecuting(
        DbCommand command,
        CommandEventData eventData,
        InterceptionResult<DbDataReader> result)
    {
        Console.WriteLine($"Executing SQL: {command.CommandText}");
        return base.ReaderExecuting(command, eventData, result);
    }
}
```
This lets me log executed SQL, timings, and detect potential bottlenecks in real time.
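Wiring the interceptor into the context is a one-liner on the options builder; the same call works with `AddDbContextPool`. A minimal sketch:

```csharp
// Register QueryLogger so every command executed by AppDbContext is logged.
builder.Services.AddDbContext<AppDbContext>(options =>
    options.UseSqlServer(connectionString)
           .AddInterceptors(new QueryLogger()));
```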
5️⃣ Performance Benchmarks
Benchmarks provide hard data to guide optimizations.
```csharp
var sw = Stopwatch.StartNew();
await efCoreService.GetStudentById(1);
sw.Stop();
Console.WriteLine($"EF Core: {sw.ElapsedMilliseconds} ms");

sw.Restart();
await dapperService.GetActiveStudents();
sw.Stop();
Console.WriteLine($"Dapper: {sw.ElapsedMilliseconds} ms");
```
By comparing execution times, I can identify hotspots and decide when to switch between EF Core, Dapper, or ADO.NET.
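A `Stopwatch` is fine for quick checks, but for statistically sound numbers (warmup, many iterations, allocation data) I would reach for BenchmarkDotNet. A sketch under assumptions: `IStudentService` and `IDapperRepository` are the article's abstractions, and `CreateEfService`/`CreateDapperService` are hypothetical factory helpers you would implement for your own setup.

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using BenchmarkDotNet.Attributes;

[MemoryDiagnoser]
public class DataAccessBenchmarks
{
    private IStudentService _efCoreService = null!;
    private IDapperRepository _dapperService = null!;

    [GlobalSetup]
    public void Setup()
    {
        _efCoreService = CreateEfService();     // hypothetical helper
        _dapperService = CreateDapperService(); // hypothetical helper
    }

    [Benchmark(Baseline = true)]
    public Task<Student?> EfCoreCompiled() => _efCoreService.GetStudentById(1);

    [Benchmark]
    public Task<IEnumerable<Student>> DapperRead() => _dapperService.GetActiveStudents();
}

// Entry point: BenchmarkDotNet.Running.BenchmarkRunner.Run<DataAccessBenchmarks>();
```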
Additional Enhancements
Connection Pooling
Using `DbContext` pooling improves scalability:
```csharp
builder.Services.AddDbContextPool<AppDbContext>(options =>
    options.UseSqlServer(connectionString));
```
Transaction Management
I coordinate multi-step operations using `IDbContextTransaction` or a native `SqlTransaction` when combining EF Core and ADO.NET/Dapper.
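Sharing one transaction between EF Core and Dapper can be sketched as follows, assuming both run on the context's own connection; `GetDbTransaction()` hands EF Core's transaction to Dapper so all steps commit or roll back together.

```csharp
using Dapper;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Storage;

// One database transaction spans the EF Core write and the Dapper write.
await using var transaction = await context.Database.BeginTransactionAsync();
try
{
    context.Students.Add(new Student { FirstName = "Ada", LastName = "Lovelace", IsActive = true });
    await context.SaveChangesAsync();

    var connection = context.Database.GetDbConnection();
    await connection.ExecuteAsync(
        "UPDATE Students SET IsActive = 0 WHERE Id = @Id",
        new { Id = 42 },
        transaction.GetDbTransaction()); // enlist Dapper in EF Core's transaction

    await transaction.CommitAsync();
}
catch
{
    await transaction.RollbackAsync();
    throw;
}
```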
Health Checks and Diagnostics
I add ASP.NET Core health checks to monitor database connectivity:
```csharp
builder.Services.AddHealthChecks().AddSqlServer(connectionString);
```
Dependency Injection
I register services cleanly in `Program.cs`:
```csharp
builder.Services.AddScoped<IStudentService, StudentService>();
builder.Services.AddScoped<IStudentRepository, EfCoreStudentRepository>();
builder.Services.AddScoped<IDapperRepository, DapperRepository>();
```
Expected Outcomes
| Approach | Use Case | Expected Speed |
| --- | --- | --- |
| EF Core + compiled queries | Standard CRUD, rich relationships | Medium |
| Dapper | Flat, read-heavy queries | Fast |
| ADO.NET + TVP | Bulk operations, multi-result sets | Fastest |
By combining the right tool for each use case, I ensure the best balance between speed, flexibility, and maintainability.
GitHub-Ready Features
- ✅ Clean, modular architecture
- ✅ REST API endpoints demonstrating each approach
- ✅ Benchmark results and performance summaries
- ✅ Logging, diagnostics, and health checks
- ✅ Full README with setup, explanation, and trade-offs
I want this project to act as a practical best practices showcase for advanced C# database development.
Example C# Class: Advanced Student Repository
Here’s a complete C# class example that integrates EF Core, Dapper, and ADO.NET within a single repository pattern, following the advanced techniques discussed in the article.
```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient; // Microsoft.Data.SqlClient is the modern equivalent
using System.Threading.Tasks;
using Dapper;
using Microsoft.EntityFrameworkCore;

public class AdvancedStudentRepository : IDisposable
{
    private readonly AppDbContext _context;
    private readonly SqlConnection _connection;

    public AdvancedStudentRepository(AppDbContext context, string connectionString)
    {
        _context = context;
        _connection = new SqlConnection(connectionString);
    }

    // EF Core compiled query: parsed and cached once, reused on every call.
    private static readonly Func<AppDbContext, int, Task<Student?>> GetStudentByIdQuery =
        EF.CompileAsyncQuery((AppDbContext ctx, int id) =>
            ctx.Students.FirstOrDefault(s => s.Id == id));

    public Task<Student?> GetStudentByIdEfCoreAsync(int id)
    {
        return GetStudentByIdQuery(_context, id);
    }

    // Dapper: opens and closes the connection automatically if it is closed.
    public async Task<IEnumerable<Student>> GetActiveStudentsDapperAsync()
    {
        const string sql = "SELECT Id, FirstName, LastName, IsActive FROM Students WHERE IsActive = 1";
        return await _connection.QueryAsync<Student>(sql);
    }

    // ADO.NET with a table-valued parameter (dbo.IdList).
    public async Task<IEnumerable<Student>> GetStudentsByIdsAdoNetAsync(List<int> ids)
    {
        var table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        ids.ForEach(id => table.Rows.Add(id));

        using var cmd = new SqlCommand("GetStudentsByIds", _connection)
        {
            CommandType = CommandType.StoredProcedure
        };
        // Mark the parameter as a TVP and point it at the server-side type.
        var param = cmd.Parameters.AddWithValue("@Ids", table);
        param.SqlDbType = SqlDbType.Structured;
        param.TypeName = "dbo.IdList";

        var students = new List<Student>();
        await _connection.OpenAsync();
        try
        {
            using var reader = await cmd.ExecuteReaderAsync();
            while (await reader.ReadAsync())
            {
                students.Add(new Student
                {
                    Id = reader.GetInt32(reader.GetOrdinal("Id")),
                    FirstName = reader.GetString(reader.GetOrdinal("FirstName")),
                    LastName = reader.GetString(reader.GetOrdinal("LastName")),
                    IsActive = reader.GetBoolean(reader.GetOrdinal("IsActive"))
                });
            }
        }
        finally
        {
            _connection.Close();
        }
        return students;
    }

    public void Dispose() => _connection.Dispose();
}
```
Highlights
- ✅ Uses EF Core compiled query for fast lookups
- ✅ Uses Dapper for quick, simple queries
- ✅ Uses ADO.NET for advanced bulk operations with TVPs
- ✅ Follows repository pattern for clean separation of concerns
Conclusion
By combining compiled EF Core queries, the Dapper micro-ORM, and ADO.NET techniques such as TVPs, I can build advanced, high-performance .NET applications that balance flexibility, speed, and maintainability. Adding structured logging, interceptors, and performance benchmarks ensures that I can observe, tune, and evolve the system confidently.