Previous article: ASP.NET Core Cloud Domination: Master Azure & AWS Global Scaling, Storage & Hosting | Production Guide (Part-25 of 40)
Table of Contents
Introduction to ASP.NET Core Performance
Understanding Performance Metrics
Async/Await Deep Dive
Entity Framework Core Performance
Caching Strategies
Response Compression
Database Optimization
Memory Management
Profiling Tools
Real-World Case Study
Advanced Performance Patterns
Monitoring and Diagnostics
Conclusion
1. Introduction to ASP.NET Core Performance
Performance is the cornerstone of modern web applications. In today's competitive digital landscape, users expect blazing-fast responses and seamless experiences. ASP.NET Core provides a robust foundation for building high-performance web applications, but mastering its performance capabilities requires a deep understanding and strategic implementation.
Why Performance Matters
User Experience: 53% of mobile users abandon sites that take longer than 3 seconds to load
Conversion Rates: A 1-second delay in page response can result in a 7% reduction in conversions
SEO Impact: Google uses page speed as a ranking factor in search results
Infrastructure Costs: Optimized applications require fewer server resources
The Performance Mindset
// Bad: Synchronous database call in controller
public IActionResult GetUser(int id)
{
var user = _context.Users.Find(id); // Blocks thread
return View(user);
}
// Good: Asynchronous database call
public async Task<IActionResult> GetUserAsync(int id)
{
var user = await _context.Users.FindAsync(id); // Non-blocking
return View(user);
}
2. Understanding Performance Metrics
Key Performance Indicators
public class PerformanceMetrics
{
// Response Time
public TimeSpan AverageResponseTime { get; set; }
public TimeSpan P95ResponseTime { get; set; }
// Throughput
public int RequestsPerSecond { get; set; }
public int ConcurrentUsers { get; set; }
// Resource Utilization
public double CpuUsage { get; set; }
public long MemoryUsage { get; set; }
public int ThreadPoolThreads { get; set; }
// Database Metrics
public int DatabaseQueriesPerRequest { get; set; }
public TimeSpan DatabaseQueryTime { get; set; }
}
Implementing Custom Metrics Middleware
public class PerformanceMonitoringMiddleware
{
private readonly RequestDelegate _next;
private readonly ILogger<PerformanceMonitoringMiddleware> _logger;
public PerformanceMonitoringMiddleware(
RequestDelegate next,
ILogger<PerformanceMonitoringMiddleware> logger)
{
_next = next;
_logger = logger;
}
public async Task InvokeAsync(HttpContext context)
{
var stopwatch = Stopwatch.StartNew();
var startMemory = GC.GetTotalMemory(false);
try
{
await _next(context);
}
finally
{
stopwatch.Stop();
var endMemory = GC.GetTotalMemory(false);
var memoryUsed = endMemory - startMemory;
_logger.LogInformation(
"Request: {Method} {Path} completed in {ElapsedMs}ms using {MemoryBytes} bytes",
context.Request.Method,
context.Request.Path,
stopwatch.ElapsedMilliseconds,
memoryUsed);
// Log slow requests
if (stopwatch.ElapsedMilliseconds > 1000) // 1 second threshold
{
_logger.LogWarning(
"SLOW REQUEST: {Method} {Path} took {ElapsedMs}ms",
context.Request.Method,
context.Request.Path,
stopwatch.ElapsedMilliseconds);
}
}
}
}
// Register in Program.cs
app.UseMiddleware<PerformanceMonitoringMiddleware>();
3. Async/Await Deep Dive
Understanding the Async/Await Pattern
public class OrderService
{
private readonly IOrderRepository _orderRepository;
private readonly IPaymentService _paymentService;
private readonly IEmailService _emailService;
private readonly IInventoryService _inventoryService;
public OrderService(
IOrderRepository orderRepository,
IPaymentService paymentService,
IEmailService emailService,
IInventoryService inventoryService)
{
_orderRepository = orderRepository;
_paymentService = paymentService;
_emailService = emailService;
_inventoryService = inventoryService;
}
// Synchronous version - AVOID THIS
public OrderResult ProcessOrderSync(OrderRequest request)
{
// 1. Validate order (CPU-bound)
var validationResult = ValidateOrder(request);
if (!validationResult.IsValid)
return OrderResult.Failure(validationResult.Errors);
// 2. Process payment (I/O-bound - BLOCKING!)
var paymentResult = _paymentService.ProcessPayment(request.Payment);
if (!paymentResult.Success)
return OrderResult.Failure("Payment failed");
// 3. Create order in database (I/O-bound - BLOCKING!)
var order = _orderRepository.Create(request);
// 4. Send confirmation email (I/O-bound - BLOCKING!)
_emailService.SendConfirmation(order);
return OrderResult.Success(order);
}
// Asynchronous version - USE THIS
public async Task<OrderResult> ProcessOrderAsync(OrderRequest request)
{
// 1. Validate order (still synchronous - CPU-bound)
var validationResult = ValidateOrder(request);
if (!validationResult.IsValid)
return OrderResult.Failure(validationResult.Errors);
// 2. Process payment (I/O-bound - NON-BLOCKING!)
var paymentResult = await _paymentService.ProcessPaymentAsync(request.Payment);
if (!paymentResult.Success)
return OrderResult.Failure("Payment failed");
// 3. Create order in database (I/O-bound - NON-BLOCKING!)
var order = await _orderRepository.CreateAsync(request);
// 4. Send confirmation email (I/O-bound - NON-BLOCKING!)
// Fire and forget - don't await if not critical
_ = Task.Run(() => _emailService.SendConfirmationAsync(order));
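// Caution: exceptions thrown by this task are unobserved; for reliable delivery consider a background queue (e.g. a hosted service reading from a Channel) instead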
return OrderResult.Success(order);
}
// Advanced: Parallel async operations
public async Task<OrderResult> ProcessOrderParallelAsync(OrderRequest request)
{
var validationResult = ValidateOrder(request);
if (!validationResult.IsValid)
return OrderResult.Failure(validationResult.Errors);
// Start multiple async operations
var paymentTask = _paymentService.ProcessPaymentAsync(request.Payment);
var inventoryTask = _inventoryService.ReserveItemsAsync(request.Items);
// Wait for all to complete
await Task.WhenAll(paymentTask, inventoryTask);
if (!paymentTask.Result.Success)
return OrderResult.Failure("Payment failed");
if (!inventoryTask.Result.Success)
return OrderResult.Failure("Inventory reservation failed");
var order = await _orderRepository.CreateAsync(request);
return OrderResult.Success(order);
}
}
Common Async/Await Pitfalls and Solutions
public class AsyncAntiPatterns
{
// 1. Async Void - AVOID (except event handlers)
public async void BadMethodAsync()
{
await Task.Delay(1000);
// Exceptions can't be caught properly!
}
// 2. Blocking Async Code - AVOID
public void BadBlocking()
{
var result = SomeAsyncMethod().Result; // Can cause deadlocks
var result2 = SomeAsyncMethod().GetAwaiter().GetResult(); // Still bad
}
// 3. ConfigureAwait False - USE CAREFULLY
public async Task GoodAsyncMethod()
{
await SomeAsyncOperation().ConfigureAwait(false);
// After this point, we might not be on the original context
}
// 4. Excessive Async - DON'T OVERUSE
public Task<int> CalculateSyncOperation()
{
// This is CPU-bound work - don't make it async
var result = HeavyCalculation();
return Task.FromResult(result); // Wrap in task if interface requires it
}
// 5. Proper cancellation support
public async Task<string> DownloadWithTimeoutAsync(
string url,
CancellationToken cancellationToken = default)
{
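// Note: a new HttpClient per call is shown for brevity; in production prefer a shared or IHttpClientFactory-managed client to avoid socket exhaustion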
using var client = new HttpClient();
using var timeoutCts = new CancellationTokenSource(TimeSpan.FromSeconds(30));
using var linkedCts = CancellationTokenSource.CreateLinkedTokenSource(
cancellationToken, timeoutCts.Token);
try
{
var response = await client.GetAsync(url, linkedCts.Token);
return await response.Content.ReadAsStringAsync();
}
catch (OperationCanceledException)
{
throw new TimeoutException("Download timed out");
}
}
}
Real-World Async Controller Example
[ApiController]
[Route("api/[controller]")]
public class ProductsController : ControllerBase
{
private readonly IProductService _productService;
private readonly IImageService _imageService;
private readonly ICacheService _cacheService;
public ProductsController(
IProductService productService,
IImageService imageService,
ICacheService cacheService)
{
_productService = productService;
_imageService = imageService;
_cacheService = cacheService;
}
[HttpGet("{id}")]
public async Task<IActionResult> GetProduct(int id)
{
// Cache-aside pattern with async
var cacheKey = $"product_{id}";
var product = await _cacheService.GetAsync<Product>(cacheKey);
if (product == null)
{
product = await _productService.GetProductAsync(id);
if (product != null)
{
await _cacheService.SetAsync(cacheKey, product, TimeSpan.FromMinutes(30));
}
}
return product != null ? Ok(product) : NotFound();
}
[HttpPost]
public async Task<IActionResult> CreateProduct([FromForm] ProductCreateRequest request)
{
// Process image upload asynchronously
if (request.Image != null)
{
var imageUrl = await _imageService.UploadImageAsync(request.Image);
request.ImageUrl = imageUrl;
}
// Save product to database
var product = await _productService.CreateProductAsync(request);
// Invalidate related cache entries
await _cacheService.RemoveAsync("products_list");
await _cacheService.RemoveAsync($"category_{product.CategoryId}_products");
return CreatedAtAction(nameof(GetProduct), new { id = product.Id }, product);
}
[HttpGet("search")]
public async Task<IActionResult> SearchProducts(
[FromQuery] string query,
[FromQuery] int page = 1,
[FromQuery] int pageSize = 20,
CancellationToken cancellationToken = default)
{
// Parallel async operations for better performance
var searchTask = _productService.SearchProductsAsync(query, page, pageSize, cancellationToken);
var filtersTask = _productService.GetSearchFiltersAsync(query, cancellationToken);
await Task.WhenAll(searchTask, filtersTask);
var result = new SearchResult
{
Products = await searchTask,
Filters = await filtersTask,
Page = page,
TotalResults = await _productService.GetSearchCountAsync(query, cancellationToken)
};
return Ok(result);
}
}
4. Entity Framework Core Performance
Efficient Query Patterns
public class EfficientProductService
{
private readonly ApplicationDbContext _context;
public EfficientProductService(ApplicationDbContext context)
{
_context = context;
}
// BAD: N+1 Query Problem
public async Task<List<ProductDto>> GetProductsBadAsync(int categoryId)
{
var products = await _context.Products
.Where(p => p.CategoryId == categoryId)
.ToListAsync();
var result = new List<ProductDto>();
foreach (var product in products)
{
// BAD: Separate query for each product's category
var category = await _context.Categories
.FirstOrDefaultAsync(c => c.Id == product.CategoryId);
result.Add(new ProductDto
{
Id = product.Id,
Name = product.Name,
CategoryName = category.Name // N+1 problem!
});
}
return result;
}
// GOOD: Eager Loading with Include
public async Task<List<ProductDto>> GetProductsGoodAsync(int categoryId)
{
var products = await _context.Products
.Where(p => p.CategoryId == categoryId)
.Include(p => p.Category) // Single query with JOIN
.Include(p => p.Supplier)
.Select(p => new ProductDto
{
Id = p.Id,
Name = p.Name,
CategoryName = p.Category.Name, // Already loaded
SupplierName = p.Supplier.Name,
Price = p.Price
})
.AsNoTracking() // Read-only query - no change tracking
.ToListAsync();
return products;
}
// BETTER: Projection (Select) only needed fields
public async Task<List<ProductSummaryDto>> GetProductsProjectionAsync(int categoryId)
{
return await _context.Products
.Where(p => p.CategoryId == categoryId)
.Select(p => new ProductSummaryDto
{
Id = p.Id,
Name = p.Name,
Price = p.Price,
CategoryName = p.Category.Name,
IsInStock = p.StockQuantity > 0
})
.AsNoTracking()
.ToListAsync();
}
// EXCELLENT: Compiled Queries for frequently used queries
// (EF.CompileAsyncQuery returns an IAsyncEnumerable for sequence results)
private static readonly Func<ApplicationDbContext, int, IAsyncEnumerable<ProductSummaryDto>>
GetProductsByCategoryQuery =
EF.CompileAsyncQuery(
(ApplicationDbContext context, int categoryId) =>
context.Products
.Where(p => p.CategoryId == categoryId)
.Select(p => new ProductSummaryDto
{
Id = p.Id,
Name = p.Name,
Price = p.Price,
CategoryName = p.Category.Name
})
.AsNoTracking());
public async Task<List<ProductSummaryDto>> GetProductsCompiledAsync(int categoryId)
{
var results = new List<ProductSummaryDto>();
await foreach (var product in GetProductsByCategoryQuery(_context, categoryId))
{
results.Add(product);
}
return results;
}
}
Bulk Operations and Batch Processing
public class BulkOperationsService
{
private readonly ApplicationDbContext _context;
public BulkOperationsService(ApplicationDbContext context)
{
_context = context;
}
// BAD: Individual inserts - SLOW
public async Task ImportProductsBadAsync(List<ProductImportDto> products)
{
foreach (var productDto in products)
{
var product = new Product
{
Name = productDto.Name,
Price = productDto.Price,
CategoryId = productDto.CategoryId,
CreatedAt = DateTime.UtcNow
};
_context.Products.Add(product);
await _context.SaveChangesAsync(); // BAD: Save after each insert
}
}
// GOOD: Batch insert
public async Task ImportProductsGoodAsync(List<ProductImportDto> products)
{
var entities = products.Select(p => new Product
{
Name = p.Name,
Price = p.Price,
CategoryId = p.CategoryId,
CreatedAt = DateTime.UtcNow
}).ToList();
_context.Products.AddRange(entities);
await _context.SaveChangesAsync(); // Single database roundtrip
}
// BETTER: Bulk Extensions for large datasets
public async Task ImportProductsBulkAsync(List<ProductImportDto> products)
{
var entities = products.Select(p => new Product
{
Name = p.Name,
Price = p.Price,
CategoryId = p.CategoryId,
CreatedAt = DateTime.UtcNow
}).ToList();
// Using third-party bulk extensions (e.g., EFCore.BulkExtensions)
await _context.BulkInsertAsync(entities, new BulkConfig
{
BatchSize = 1000,
UseTempDB = true
});
}
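// Note: ExecuteUpdateAsync and ExecuteDeleteAsync used below require EF Core 7.0 or later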
// Efficient bulk updates
public async Task UpdateProductPricesAsync(int categoryId, decimal percentageIncrease)
{
// Single UPDATE statement
await _context.Products
.Where(p => p.CategoryId == categoryId)
.ExecuteUpdateAsync(p => p
.SetProperty(x => x.Price, x => x.Price * (1 + percentageIncrease / 100))
.SetProperty(x => x.UpdatedAt, DateTime.UtcNow));
}
// Efficient bulk deletes
public async Task DeleteOldProductsAsync(DateTime olderThan)
{
// Single DELETE statement
await _context.Products
.Where(p => p.CreatedAt < olderThan)
.ExecuteDeleteAsync();
}
}
DbContext Pooling and Configuration
public static class DatabaseConfiguration
{
public static void ConfigureDatabase(IServiceCollection services, IConfiguration configuration, IHostEnvironment environment)
{
// Regular DbContext registration
// services.AddDbContext<ApplicationDbContext>(options =>
// options.UseSqlServer(configuration.GetConnectionString("DefaultConnection")));
// DbContext Pooling for better performance
services.AddDbContextPool<ApplicationDbContext>(options =>
{
options.UseSqlServer(
configuration.GetConnectionString("DefaultConnection"),
sqlOptions =>
{
sqlOptions.EnableRetryOnFailure(
maxRetryCount: 3,
maxRetryDelay: TimeSpan.FromSeconds(5),
errorNumbersToAdd: null);
sqlOptions.CommandTimeout(30);
// Split complex multi-include queries instead of one large JOIN
sqlOptions.UseQuerySplittingBehavior(QuerySplittingBehavior.SplitQuery);
});
// Enable sensitive data logging only in development
if (environment.IsDevelopment())
{
options.EnableSensitiveDataLogging();
options.EnableDetailedErrors();
}
});
// Note: register the DbContext only once - query splitting for complex queries
// is configured inside the single registration above, not in a second AddDbContext call.
}
}
// Optimized DbContext design
public class ApplicationDbContext : DbContext
{
public ApplicationDbContext(DbContextOptions<ApplicationDbContext> options)
: base(options)
{
// Performance optimizations
ChangeTracker.QueryTrackingBehavior = QueryTrackingBehavior.NoTracking;
ChangeTracker.AutoDetectChangesEnabled = false;
}
public override async Task<int> SaveChangesAsync(CancellationToken cancellationToken = default)
{
// Auto-detect changes before save
ChangeTracker.DetectChanges();
// Set audit fields
var entries = ChangeTracker.Entries()
.Where(e => e.Entity is IAuditable &&
(e.State == EntityState.Added || e.State == EntityState.Modified));
foreach (var entry in entries)
{
var entity = (IAuditable)entry.Entity;
if (entry.State == EntityState.Added)
{
entity.CreatedAt = DateTime.UtcNow;
entity.CreatedBy = "system"; // Get from current user in real scenario
}
entity.UpdatedAt = DateTime.UtcNow;
entity.UpdatedBy = "system";
}
return await base.SaveChangesAsync(cancellationToken);
}
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
// Configure indexes for better query performance
modelBuilder.Entity<Product>()
.HasIndex(p => p.CategoryId)
.IncludeProperties(p => new { p.Name, p.Price, p.IsActive });
modelBuilder.Entity<Product>()
.HasIndex(p => new { p.Name, p.IsActive });
// Configure decimal precision
modelBuilder.Entity<Product>()
.Property(p => p.Price)
.HasPrecision(18, 2);
// Configure relationships and delete behavior
modelBuilder.Entity<Product>()
.HasOne(p => p.Category)
.WithMany(c => c.Products)
.HasForeignKey(p => p.CategoryId)
.OnDelete(DeleteBehavior.Restrict);
}
}
5. Caching Strategies
Multi-Level Caching Implementation
public interface ICacheService
{
Task<T> GetAsync<T>(string key);
Task SetAsync<T>(string key, T value, TimeSpan? expiry = null);
Task RemoveAsync(string key);
Task<bool> ExistsAsync(string key);
}
public class DistributedCacheService : ICacheService
{
private readonly IDistributedCache _cache;
private readonly ILogger<DistributedCacheService> _logger;
public DistributedCacheService(
IDistributedCache cache,
ILogger<DistributedCacheService> logger)
{
_cache = cache;
_logger = logger;
}
public async Task<T> GetAsync<T>(string key)
{
try
{
var cachedData = await _cache.GetStringAsync(key);
if (cachedData == null)
{
_logger.LogDebug("Cache miss for key: {Key}", key);
return default;
}
_logger.LogDebug("Cache hit for key: {Key}", key);
return JsonSerializer.Deserialize<T>(cachedData);
}
catch (Exception ex)
{
_logger.LogError(ex, "Error reading from cache for key: {Key}", key);
return default; // Fail gracefully
}
}
public async Task SetAsync<T>(string key, T value, TimeSpan? expiry = null)
{
try
{
var serializedData = JsonSerializer.Serialize(value);
var options = new DistributedCacheEntryOptions
{
AbsoluteExpirationRelativeToNow = expiry ?? TimeSpan.FromMinutes(30)
};
await _cache.SetStringAsync(key, serializedData, options);
_logger.LogDebug("Cache set for key: {Key} with expiry: {Expiry}", key, expiry);
}
catch (Exception ex)
{
_logger.LogError(ex, "Error writing to cache for key: {Key}", key);
}
}
public async Task RemoveAsync(string key)
{
try
{
await _cache.RemoveAsync(key);
_logger.LogDebug("Cache removed for key: {Key}", key);
}
catch (Exception ex)
{
_logger.LogError(ex, "Error removing from cache for key: {Key}", key);
}
}
public async Task<bool> ExistsAsync(string key)
{
var data = await _cache.GetStringAsync(key);
return data != null;
}
}
// Advanced caching with memory cache fallback
public class MultiLevelCacheService : ICacheService
{
private readonly IDistributedCache _distributedCache;
private readonly IMemoryCache _memoryCache;
private readonly ILogger<MultiLevelCacheService> _logger;
public MultiLevelCacheService(
IDistributedCache distributedCache,
IMemoryCache memoryCache,
ILogger<MultiLevelCacheService> logger)
{
_distributedCache = distributedCache;
_memoryCache = memoryCache;
_logger = logger;
}
public async Task<T> GetAsync<T>(string key)
{
// First, try memory cache (L1)
if (_memoryCache.TryGetValue(key, out T memoryCachedValue))
{
_logger.LogDebug("L1 cache hit for key: {Key}", key);
return memoryCachedValue;
}
// Then, try distributed cache (L2)
try
{
var distributedValue = await _distributedCache.GetStringAsync(key);
if (distributedValue != null)
{
var value = JsonSerializer.Deserialize<T>(distributedValue);
// Populate L1 cache
_memoryCache.Set(key, value, TimeSpan.FromMinutes(5));
_logger.LogDebug("L2 cache hit for key: {Key}", key);
return value;
}
}
catch (Exception ex)
{
_logger.LogError(ex, "Error reading from distributed cache for key: {Key}", key);
}
_logger.LogDebug("Cache miss for key: {Key}", key);
return default;
}
public async Task SetAsync<T>(string key, T value, TimeSpan? expiry = null)
{
var actualExpiry = expiry ?? TimeSpan.FromMinutes(30);
// Set in memory cache with shorter expiry
_memoryCache.Set(key, value, TimeSpan.FromMinutes(5));
// Set in distributed cache
try
{
var serializedData = JsonSerializer.Serialize(value);
var options = new DistributedCacheEntryOptions
{
AbsoluteExpirationRelativeToNow = actualExpiry
};
await _distributedCache.SetStringAsync(key, serializedData, options);
}
catch (Exception ex)
{
_logger.LogError(ex, "Error writing to distributed cache for key: {Key}", key);
}
}
public async Task RemoveAsync(string key)
{
// Remove from both caches
_memoryCache.Remove(key);
try
{
await _distributedCache.RemoveAsync(key);
}
catch (Exception ex)
{
_logger.LogError(ex, "Error removing from distributed cache for key: {Key}", key);
}
}
public async Task<bool> ExistsAsync(string key)
{
return _memoryCache.TryGetValue(key, out _) ||
await _distributedCache.GetStringAsync(key) != null;
}
}
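Wiring these services up is straightforward; a minimal Program.cs sketch, assuming Redis as the distributed cache (the "Redis" connection string name is an assumption):
builder.Services.AddMemoryCache();
builder.Services.AddStackExchangeRedisCache(options =>
{
options.Configuration = builder.Configuration.GetConnectionString("Redis");
});
builder.Services.AddSingleton<ICacheService, MultiLevelCacheService>();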
Cache-Aside Pattern with Stampede Protection
public class CacheAsideService
{
private readonly ICacheService _cache;
private readonly ILogger<CacheAsideService> _logger;
private readonly ConcurrentDictionary<string, SemaphoreSlim> _locks;
public CacheAsideService(ICacheService cache, ILogger<CacheAsideService> logger)
{
_cache = cache;
_logger = logger;
_locks = new ConcurrentDictionary<string, SemaphoreSlim>();
}
public async Task<T> GetOrSetAsync<T>(
string key,
Func<Task<T>> factory,
TimeSpan? expiry = null)
{
// Try to get from cache first
var cachedValue = await _cache.GetAsync<T>(key);
if (cachedValue != null)
{
return cachedValue;
}
// Cache miss - acquire lock for this key to prevent cache stampede
var lockKey = $"{key}_lock";
var lockObj = _locks.GetOrAdd(lockKey, _ => new SemaphoreSlim(1, 1));
await lockObj.WaitAsync();
try
{
// Double-check after acquiring lock
cachedValue = await _cache.GetAsync<T>(key);
if (cachedValue != null)
{
return cachedValue;
}
// Generate value using factory
_logger.LogInformation("Cache miss - generating value for key: {Key}", key);
var value = await factory();
// Set in cache
if (value != null)
{
await _cache.SetAsync(key, value, expiry);
}
return value;
}
finally
{
lockObj.Release();
_locks.TryRemove(lockKey, out _);
}
}
}
// Real-world usage example
public class ProductServiceWithCache
{
private readonly ApplicationDbContext _context;
private readonly CacheAsideService _cacheService;
public ProductServiceWithCache(
ApplicationDbContext context,
CacheAsideService cacheService)
{
_context = context;
_cacheService = cacheService;
}
public async Task<Product> GetProductAsync(int id)
{
var cacheKey = $"product_{id}";
return await _cacheService.GetOrSetAsync(
cacheKey,
async () => await _context.Products
.Include(p => p.Category)
.Include(p => p.Supplier)
.AsNoTracking()
.FirstOrDefaultAsync(p => p.Id == id),
TimeSpan.FromMinutes(30));
}
public async Task<List<Product>> GetProductsByCategoryAsync(int categoryId)
{
var cacheKey = $"products_category_{categoryId}";
return await _cacheService.GetOrSetAsync(
cacheKey,
async () => await _context.Products
.Where(p => p.CategoryId == categoryId && p.IsActive)
.Include(p => p.Category)
.AsNoTracking()
.ToListAsync(),
TimeSpan.FromMinutes(15));
}
}
Response Caching Middleware
// Custom response cache attribute
[AttributeUsage(AttributeTargets.Class | AttributeTargets.Method)]
public class CustomResponseCacheAttribute : Attribute, IAsyncActionFilter
{
private readonly int _duration;
private readonly ResponseCacheLocation _location;
private readonly bool _varyByQuery;
public CustomResponseCacheAttribute(
int durationInSeconds,
ResponseCacheLocation location = ResponseCacheLocation.Any,
bool varyByQuery = true)
{
_duration = durationInSeconds;
_location = location;
_varyByQuery = varyByQuery;
}
public async Task OnActionExecutionAsync(
ActionExecutingContext context,
ActionExecutionDelegate next)
{
var executedContext = await next();
if (executedContext.Result is ObjectResult objectResult &&
(objectResult.StatusCode ?? 200) == 200)
{
var response = context.HttpContext.Response;
response.GetTypedHeaders().CacheControl = new CacheControlHeaderValue
{
Public = _location == ResponseCacheLocation.Any,
Private = _location == ResponseCacheLocation.Client,
MaxAge = TimeSpan.FromSeconds(_duration),
MustRevalidate = true
};
// Add custom cache headers
response.Headers[HeaderNames.Vary] = _varyByQuery ? "Accept-Encoding,Accept" : "Accept-Encoding";
response.Headers["X-Cache-Info"] = $"max-age={_duration}, public";
}
}
}
// Usage in controllers
[ApiController]
[Route("api/[controller]")]
public class ProductsController : ControllerBase
{
[HttpGet("{id}")]
[CustomResponseCache(300)] // Cache for 5 minutes
public async Task<IActionResult> GetProduct(int id)
{
// Implementation
}
[HttpGet("search")]
[CustomResponseCache(60, varyByQuery: true)] // Cache for 1 minute, vary by query
public async Task<IActionResult> SearchProducts([FromQuery] string query)
{
// Implementation
}
}
// Configure response caching in Program.cs
builder.Services.AddResponseCaching(options =>
{
options.MaximumBodySize = 1024 * 1024; // 1MB
options.UseCaseSensitivePaths = false;
});
app.UseResponseCaching();
6. Response Compression
Compression Middleware Configuration
public static class CompressionConfiguration
{
public static void ConfigureCompression(IServiceCollection services)
{
services.Configure<BrotliCompressionProviderOptions>(options =>
{
options.Level = CompressionLevel.Fastest;
});
services.Configure<GzipCompressionProviderOptions>(options =>
{
options.Level = CompressionLevel.Optimal;
});
services.AddResponseCompression(options =>
{
options.EnableForHttps = true;
options.Providers.Add<BrotliCompressionProvider>();
options.Providers.Add<GzipCompressionProvider>();
// Add custom MIME types
options.MimeTypes = ResponseCompressionDefaults.MimeTypes.Concat(new[]
{
"image/svg+xml",
"application/octet-stream",
"application/wasm"
});
// Note: the built-in options expose no per-path or minimum-size exclusion;
// skip compression for specific endpoints with custom middleware (see the next section)
});
}
}
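The configuration above only registers the compression providers; the middleware itself still has to be enabled in Program.cs, early in the pipeline:
app.UseResponseCompression();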
// Custom compression provider for specific content (wraps an inner provider)
public class CustomCompressionProvider : ICompressionProvider
{
private readonly ICompressionProvider _innerProvider;
public CustomCompressionProvider(ICompressionProvider innerProvider)
{
_innerProvider = innerProvider;
}
public string EncodingName => _innerProvider.EncodingName;
public bool SupportsFlush => _innerProvider.SupportsFlush;
public Stream CreateStream(Stream outputStream)
{
return _innerProvider.CreateStream(outputStream);
}
}
Dynamic Compression Strategies
public class CompressionMiddleware
{
private readonly RequestDelegate _next;
private readonly ILogger<CompressionMiddleware> _logger;
public CompressionMiddleware(
RequestDelegate next,
ILogger<CompressionMiddleware> logger)
{
_next = next;
_logger = logger;
}
public async Task InvokeAsync(HttpContext context)
{
var originalBodyStream = context.Response.Body;
try
{
// Check if compression should be applied
if (ShouldCompressResponse(context))
{
var compressionProvider = GetCompressionProvider(context);
if (compressionProvider != null)
{
using var memoryStream = new MemoryStream();
context.Response.Body = memoryStream;
await _next(context);
// Check response size and content type
if (ShouldCompressBasedOnContent(context, memoryStream))
{
await CompressResponseAsync(
context, memoryStream, originalBodyStream, compressionProvider);
return;
}
else
{
// Copy uncompressed response
memoryStream.Seek(0, SeekOrigin.Begin);
await memoryStream.CopyToAsync(originalBodyStream);
return;
}
}
}
// Continue without compression
await _next(context);
}
finally
{
context.Response.Body = originalBodyStream;
}
}
private bool ShouldCompressResponse(HttpContext context)
{
// Skip compression for certain paths
var path = context.Request.Path.Value ?? "";
if (path.Contains("/no-compress/") || path.EndsWith(".jpg") || path.EndsWith(".png"))
{
return false;
}
// Check if client supports compression
var acceptEncoding = context.Request.Headers.AcceptEncoding.ToString();
return acceptEncoding.Contains("br") || acceptEncoding.Contains("gzip");
}
private ICompressionProvider GetCompressionProvider(HttpContext context)
{
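// Note: these provider types resolve only if they are registered with the DI container;
// otherwise this returns null and the response is sent uncompressed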
var acceptEncoding = context.Request.Headers.AcceptEncoding.ToString();
if (acceptEncoding.Contains("br"))
{
return context.RequestServices.GetService<BrotliCompressionProvider>();
}
else if (acceptEncoding.Contains("gzip"))
{
return context.RequestServices.GetService<GzipCompressionProvider>();
}
return null;
}
private bool ShouldCompressBasedOnContent(HttpContext context, MemoryStream stream)
{
// Don't compress very small responses
if (stream.Length < 1024) // 1KB
{
return false;
}
// Check content type
var contentType = context.Response.ContentType ?? "";
var compressibleTypes = new[]
{
"text/plain", "text/html", "text/css", "application/javascript",
"application/json", "application/xml", "text/xml"
};
return compressibleTypes.Any(t => contentType.StartsWith(t));
}
private async Task CompressResponseAsync(
HttpContext context,
MemoryStream sourceStream,
Stream destinationStream,
ICompressionProvider compressionProvider)
{
sourceStream.Seek(0, SeekOrigin.Begin);
using var compressionStream = compressionProvider.CreateStream(destinationStream);
// Set appropriate content encoding header
if (compressionProvider is BrotliCompressionProvider)
{
context.Response.Headers.ContentEncoding = "br";
}
else if (compressionProvider is GzipCompressionProvider)
{
context.Response.Headers.ContentEncoding = "gzip";
}
// Remove content length header since it will change
context.Response.Headers.ContentLength = null;
await sourceStream.CopyToAsync(compressionStream);
await compressionStream.FlushAsync();
_logger.LogDebug(
"Compressed {OriginalSize}-byte response using {Encoding}",
sourceStream.Length, context.Response.Headers.ContentEncoding.ToString());
}
}
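If you opt for this custom middleware, register it in Program.cs (a sketch; use it instead of, not alongside, the built-in response compression middleware):
app.UseMiddleware<CompressionMiddleware>();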
7. Database Optimization
Connection Pooling and Management
public class DatabaseConnectionService
{
private readonly IConfiguration _configuration;
private readonly ILogger<DatabaseConnectionService> _logger;
private readonly ConcurrentBag<SqlConnection> _connections;
public DatabaseConnectionService(
IConfiguration configuration,
ILogger<DatabaseConnectionService> logger)
{
_configuration = configuration;
_logger = logger;
_connections = new ConcurrentBag<SqlConnection>();
}
public async Task<SqlConnection> GetOptimizedConnectionAsync()
{
var connectionString = _configuration.GetConnectionString("DefaultConnection");
var connection = new SqlConnection(connectionString);
// Configure connection for performance
await connection.OpenAsync();
// Set connection-level settings
using var setCommand = connection.CreateCommand();
setCommand.CommandText = @"
SET ARITHABORT ON;
SET NUMERIC_ROUNDABORT OFF;
SET CONCAT_NULL_YIELDS_NULL ON;
SET XACT_ABORT ON;
SET TRANSACTION ISOLATION LEVEL READ COMMITTED;";
await setCommand.ExecuteNonQueryAsync();
return connection;
}
// Connection resilience with retry logic
public async Task<T> ExecuteWithRetryAsync<T>(
Func<SqlConnection, Task<T>> operation,
int maxRetries = 3)
{
var retryCount = 0;
var delays = new[] { TimeSpan.FromSeconds(1), TimeSpan.FromSeconds(3), TimeSpan.FromSeconds(5) };
while (true)
{
SqlConnection connection = null;
try
{
connection = await GetOptimizedConnectionAsync();
return await operation(connection);
}
catch (SqlException ex) when (IsTransientError(ex) && retryCount < maxRetries)
{
_logger.LogWarning(ex,
"Transient database error occurred. Retry attempt {RetryCount}",
retryCount + 1);
await Task.Delay(delays[retryCount]);
retryCount++;
}
finally
{
connection?.Close();
connection?.Dispose();
}
}
}
private bool IsTransientError(SqlException ex)
{
// SQL Server transient error numbers
int[] transientErrors = { 4060, 40197, 40501, 40613, 49918, 49919, 49920, 11001 };
return transientErrors.Contains(ex.Number);
}
}
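A usage sketch for the retry helper (the injected _connectionService field and the table name are assumptions for illustration):
var productCount = await _connectionService.ExecuteWithRetryAsync(async connection =>
{
using var command = connection.CreateCommand();
command.CommandText = "SELECT COUNT(*) FROM Products";
return Convert.ToInt32(await command.ExecuteScalarAsync());
});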
Advanced Query Optimization
public class OptimizedProductRepository
{
private readonly ApplicationDbContext _context;
private readonly ILogger<OptimizedProductRepository> _logger;
public OptimizedProductRepository(
ApplicationDbContext context,
ILogger<OptimizedProductRepository> logger)
{
_context = context;
_logger = logger;
}
// Pagination with keyset pagination (better than OFFSET)
public async Task<PaginatedResult<Product>> GetProductsKeysetPaginationAsync(
int? lastId = null,
int pageSize = 50)
{
var query = _context.Products
.Where(p => p.IsActive)
.OrderBy(p => p.Id)
.Include(p => p.Category)
.AsNoTracking();
if (lastId.HasValue)
{
query = query.Where(p => p.Id > lastId.Value);
}
var products = await query
.Take(pageSize + 1) // Get one extra to check for next page
.ToListAsync();
var hasNextPage = products.Count > pageSize;
if (hasNextPage)
{
products = products.Take(pageSize).ToList();
}
return new PaginatedResult<Product>
{
Items = products,
HasNextPage = hasNextPage,
LastId = products.LastOrDefault()?.Id
};
}
// Efficient search with full-text capabilities
public async Task<List<Product>> SearchProductsAdvancedAsync(string searchTerm)
{
// Use EF.Functions.FreeText for full-text search if available
if (!string.IsNullOrWhiteSpace(searchTerm))
{
return await _context.Products
.Where(p => EF.Functions.FreeText(p.Name, searchTerm) ||
EF.Functions.FreeText(p.Description, searchTerm))
.AsNoTracking()
.Take(100)
.ToListAsync();
}
return await _context.Products
.Where(p => p.IsActive)
.AsNoTracking()
.Take(50)
.ToListAsync();
}
// Batch processing for large datasets
public async Task ProcessProductsInBatchesAsync(
Func<List<Product>, Task> processor,
int batchSize = 1000)
{
var totalProcessed = 0;
var hasMore = true;
while (hasMore)
{
var products = await _context.Products
.Where(p => p.IsActive)
.OrderBy(p => p.Id)
.Skip(totalProcessed)
.Take(batchSize)
.AsNoTracking()
.ToListAsync();
if (products.Any())
{
await processor(products);
totalProcessed += products.Count;
_logger.LogInformation("Processed {Count} products, total: {Total}",
products.Count, totalProcessed);
}
else
{
hasMore = false;
}
// Prevent memory issues
if (totalProcessed % 10000 == 0)
{
GC.Collect();
}
}
}
// Stored procedure execution
public async Task<List<Product>> GetProductsByCategoryStoredProcAsync(int categoryId)
{
var categoryIdParam = new SqlParameter("@CategoryId", categoryId);
return await _context.Products
.FromSqlRaw("EXEC dbo.GetProductsByCategory @CategoryId", categoryIdParam)
.AsNoTracking()
.ToListAsync();
}
}
8. Memory Management
Object Pooling Implementation
public interface IObjectPool<T>
{
T Get();
void Return(T item);
}
public class ObjectPool<T> : IObjectPool<T> where T : class, new()
{
private readonly ConcurrentBag<T> _objects;
private readonly Func<T> _objectGenerator;
private readonly Action<T> _resetAction;
private readonly int _maxSize;
public ObjectPool(
Func<T> objectGenerator = null,
Action<T> resetAction = null,
int maxSize = 100)
{
_objects = new ConcurrentBag<T>();
_objectGenerator = objectGenerator ?? (() => new T());
_resetAction = resetAction;
_maxSize = maxSize;
}
public T Get()
{
if (_objects.TryTake(out T item))
{
return item;
}
return _objectGenerator();
}
public void Return(T item)
{
if (_objects.Count < _maxSize)
{
_resetAction?.Invoke(item);
_objects.Add(item);
}
}
}
// Usage for HttpClient pooling
public class HttpClientPool
{
private readonly ObjectPool<HttpClient> _pool;
public HttpClientPool()
{
_pool = new ObjectPool<HttpClient>(
() => new HttpClient { Timeout = TimeSpan.FromSeconds(30) },
client => client.DefaultRequestHeaders.Clear(),
maxSize: 50);
}
public async Task<string> GetStringAsync(string url)
{
var client = _pool.Get();
try
{
return await client.GetStringAsync(url);
}
finally
{
_pool.Return(client);
}
}
}
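That said, in most ASP.NET Core applications the built-in IHttpClientFactory already pools and recycles the underlying handlers, so a hand-rolled HttpClient pool is rarely necessary; a minimal registration sketch (the client name is illustrative):
builder.Services.AddHttpClient("inventory", client =>
{
client.Timeout = TimeSpan.FromSeconds(30);
});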
Memory-Optimized Services
public class MemoryOptimizedProductService
{
private readonly ApplicationDbContext _context;
private readonly ILogger<MemoryOptimizedProductService> _logger;
public MemoryOptimizedProductService(
ApplicationDbContext context,
ILogger<MemoryOptimizedProductService> logger)
{
_context = context;
_logger = logger;
}
// Use structs for better memory efficiency
public readonly struct ProductSummary
{
public int Id { get; }
public string Name { get; }
public decimal Price { get; }
public string CategoryName { get; }
public ProductSummary(int id, string name, decimal price, string categoryName)
{
Id = id;
Name = name;
Price = price;
CategoryName = categoryName;
}
}
public async Task<List<ProductSummary>> GetProductSummariesAsync(int categoryId)
{
return await _context.Products
.Where(p => p.CategoryId == categoryId && p.IsActive)
.Select(p => new ProductSummary(
p.Id,
p.Name,
p.Price,
p.Category.Name))
.AsNoTracking()
.ToListAsync();
}
// Stream large results to avoid memory pressure
public async IAsyncEnumerable<Product> StreamLargeProductSetAsync()
{
await using var command = _context.Database.GetDbConnection().CreateCommand();
command.CommandText = "SELECT Id, Name, Price FROM Products WHERE IsActive = 1";
await _context.Database.OpenConnectionAsync();
await using var reader = await command.ExecuteReaderAsync(CommandBehavior.SequentialAccess);
while (await reader.ReadAsync())
{
yield return new Product
{
Id = reader.GetInt32(0),
Name = reader.GetString(1),
Price = reader.GetDecimal(2)
};
}
}
// Memory-efficient batch processing
public async Task ProcessLargeDatasetEfficientlyAsync()
{
const int batchSize = 1000;
var totalProcessed = 0;
var lastId = 0; // keyset cursor: track the last Id seen, not the row count
while (true)
{
var products = await _context.Products
.Where(p => p.Id > lastId)
.OrderBy(p => p.Id)
.Take(batchSize)
.AsNoTracking()
.Select(p => new { p.Id, p.Name })
.ToListAsync();
if (!products.Any())
break;
// Process batch
foreach (var product in products)
{
// Lightweight processing
await ProcessProductAsync(product);
}
lastId = products.Last().Id;
totalProcessed += products.Count;
// Force garbage collection periodically
if (totalProcessed % 10000 == 0)
{
GC.Collect();
_logger.LogInformation("Processed {Total} items, memory: {Memory}MB",
totalProcessed, GC.GetTotalMemory(false) / 1024 / 1024);
}
}
}
private Task ProcessProductAsync(object product)
{
// Simulate processing
return Task.CompletedTask;
}
}
9. Profiling Tools
Application Insights Integration
public static class TelemetryConfiguration
{
public static void ConfigureTelemetry(IServiceCollection services, IConfiguration configuration)
{
services.AddApplicationInsightsTelemetry(options =>
{
options.EnableAdaptiveSampling = false; // Get all telemetry for debugging
options.EnablePerformanceCounterCollectionModule = true;
options.EnableDependencyTrackingTelemetryModule = true;
});
services.AddApplicationInsightsTelemetryProcessor<CustomFilteringTelemetryProcessor>();
// Configure logging to Application Insights
services.AddLogging(logging =>
{
logging.AddApplicationInsights();
logging.AddFilter<ApplicationInsightsLoggerProvider>("", LogLevel.Information);
});
}
}
public class CustomFilteringTelemetryProcessor : ITelemetryProcessor
{
private readonly ITelemetryProcessor _next;
public CustomFilteringTelemetryProcessor(ITelemetryProcessor next)
{
_next = next;
}
public void Process(ITelemetry item)
{
// Filter out health check requests
if (item is RequestTelemetry request &&
request.Url.ToString().Contains("/health"))
{
return;
}
// Filter out static files
if (item is RequestTelemetry staticRequest &&
(staticRequest.Url.ToString().EndsWith(".css") ||
staticRequest.Url.ToString().EndsWith(".js") ||
staticRequest.Url.ToString().EndsWith(".png")))
{
return;
}
_next.Process(item);
}
}
// Custom telemetry for performance monitoring
public class PerformanceTelemetryService
{
private readonly TelemetryClient _telemetryClient;
public PerformanceTelemetryService(TelemetryClient telemetryClient)
{
_telemetryClient = telemetryClient;
}
public async Task<T> TrackDependencyAsync<T>(
string dependencyType,
string dependencyName,
string command,
Func<Task<T>> operation)
{
var startTime = DateTime.UtcNow;
var timer = System.Diagnostics.Stopwatch.StartNew();
var success = false;
try
{
var result = await operation();
success = true;
return result;
}
finally
{
timer.Stop();
_telemetryClient.TrackDependency(
dependencyType,
dependencyName,
command,
startTime,
timer.Elapsed,
success);
}
}
public IDisposable TrackRequest(string name)
{
return new RequestTracker(_telemetryClient, name);
}
private class RequestTracker : IDisposable
{
private readonly TelemetryClient _telemetryClient;
private readonly string _name;
private readonly Stopwatch _stopwatch;
private readonly DateTime _startTime;
public RequestTracker(TelemetryClient telemetryClient, string name)
{
_telemetryClient = telemetryClient;
_name = name;
_stopwatch = Stopwatch.StartNew();
_startTime = DateTime.UtcNow;
}
public void Dispose()
{
_stopwatch.Stop();
_telemetryClient.TrackRequest(_name, _startTime, _stopwatch.Elapsed, "200", true);
}
}
}
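A usage sketch for the dependency tracker (the field names and endpoint URL are illustrative):
var stockJson = await _performanceTelemetry.TrackDependencyAsync(
"HTTP",
"inventory-api",
"GET /api/stock",
() => _httpClient.GetStringAsync("https://inventory.example.com/api/stock"));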
MiniProfiler Integration
public static class ProfilerConfiguration
{
public static void ConfigureProfiler(IServiceCollection services)
{
services.AddMiniProfiler(options =>
{
options.RouteBasePath = "/profiler";
options.ColorScheme = StackExchange.Profiling.ColorScheme.Auto;
options.EnableMvcFilterProfiling = true;
options.EnableMvcViewProfiling = true;
options.SqlFormatter = new StackExchange.Profiling.SqlFormatters.InlineFormatter();
// Track database connections
options.TrackConnectionOpenClose = true;
// Configure storage - use memory for development
options.Storage = new MemoryCacheStorage(
TimeSpan.FromMinutes(60));
}).AddEntityFrameworkCore();
}
}
// Custom profiling attributes
[AttributeUsage(AttributeTargets.Method | AttributeTargets.Class)]
public class ProfileAttribute : ActionFilterAttribute
{
private readonly string _name;
public ProfileAttribute(string name = null)
{
_name = name;
}
public override void OnActionExecuting(ActionExecutingContext context)
{
var profiler = MiniProfiler.Current;
if (profiler != null)
{
var stepName = _name ?? $"{context.ActionDescriptor.DisplayName}";
context.HttpContext.Items["ProfilingStep"] = profiler.Step(stepName);
}
base.OnActionExecuting(context);
}
public override void OnActionExecuted(ActionExecutedContext context)
{
if (context.HttpContext.Items["ProfilingStep"] is IDisposable step)
{
step.Dispose();
}
base.OnActionExecuted(context);
}
}
// Usage in controllers
[ApiController]
[Route("api/[controller]")]
[Profile("Products API")]
public class ProductsController : ControllerBase
{
private readonly ApplicationDbContext _context;
public ProductsController(ApplicationDbContext context)
{
_context = context;
}
[HttpGet]
[Profile("Get All Products")]
public async Task<IActionResult> GetProducts()
{
using (MiniProfiler.Current.Step("Database Query"))
{
var products = await _context.Products
.Include(p => p.Category)
.AsNoTracking()
.ToListAsync();
return Ok(products);
}
}
}
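MiniProfiler also needs its middleware registered in Program.cs, before the endpoints you want profiled:
app.UseMiniProfiler();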
10. Real-World Case Study
E-commerce Platform Performance Optimization
// Before optimization
public class LegacyProductService
{
private readonly ApplicationDbContext _context;
public LegacyProductService(ApplicationDbContext context)
{
_context = context;
}
public async Task<Product> GetProductDetails(int id)
{
// Multiple separate queries - N+1 problem
var product = await _context.Products.FindAsync(id);
if (product == null) return null;
product.Category = await _context.Categories.FindAsync(product.CategoryId);
product.Supplier = await _context.Suppliers.FindAsync(product.SupplierId);
product.Images = await _context.ProductImages
.Where(img => img.ProductId == id)
.ToListAsync();
product.Reviews = await _context.Reviews
.Where(r => r.ProductId == id)
.Include(r => r.User)
.ToListAsync();
return product;
}
}
// After optimization
public class OptimizedProductService
{
private readonly ApplicationDbContext _context;
private readonly ICacheService _cacheService;
private readonly ILogger<OptimizedProductService> _logger;
public OptimizedProductService(
ApplicationDbContext context,
ICacheService cacheService,
ILogger<OptimizedProductService> logger)
{
_context = context;
_cacheService = cacheService;
_logger = logger;
}
public async Task<ProductDto> GetProductDetailsAsync(int id)
{
var cacheKey = $"product_details_{id}";
return await _cacheService.GetOrSetAsync(
cacheKey,
async () => await GetProductDetailsFromDbAsync(id),
TimeSpan.FromMinutes(15));
}
private async Task<ProductDto> GetProductDetailsFromDbAsync(int id)
{
using (_logger.BeginScope("Loading product details for {ProductId}", id))
{
// Single query with all includes
var product = await _context.Products
.Where(p => p.Id == id)
.Select(p => new ProductDto
{
Id = p.Id,
Name = p.Name,
Description = p.Description,
Price = p.Price,
StockQuantity = p.StockQuantity,
Category = new CategoryDto
{
Id = p.Category.Id,
Name = p.Category.Name,
Slug = p.Category.Slug
},
Supplier = new SupplierDto
{
Id = p.Supplier.Id,
Name = p.Supplier.Name,
Rating = p.Supplier.Rating
},
Images = p.Images.Select(img => new ImageDto
{
Id = img.Id,
Url = img.Url,
AltText = img.AltText
}).ToList(),
Reviews = p.Reviews.Select(r => new ReviewDto
{
Id = r.Id,
Rating = r.Rating,
Comment = r.Comment,
CreatedAt = r.CreatedAt,
UserName = r.User.UserName
}).ToList(),
AverageRating = p.Reviews.Average(r => (double?)r.Rating) ?? 0.0,
ReviewCount = p.Reviews.Count
})
.AsNoTracking()
.AsSplitQuery() // Split query to avoid cartesian explosion
.FirstOrDefaultAsync();
if (product == null)
{
_logger.LogWarning("Product {ProductId} not found", id);
}
return product;
}
}
// Batch processing for product updates
public async Task UpdateProductPricesAsync(
Dictionary<int, decimal> priceUpdates,
CancellationToken cancellationToken = default)
{
using var transaction = await _context.Database.BeginTransactionAsync();
try
{
// Efficient bulk update
foreach (var update in priceUpdates)
{
await _context.Products
.Where(p => p.Id == update.Key)
.ExecuteUpdateAsync(
p => p.SetProperty(x => x.Price, x => update.Value)
.SetProperty(x => x.UpdatedAt, x => DateTime.UtcNow),
cancellationToken);
}
await transaction.CommitAsync();
// Invalidate cache for updated products
foreach (var productId in priceUpdates.Keys)
{
await _cacheService.RemoveAsync($"product_details_{productId}");
}
_logger.LogInformation(
"Updated prices for {Count} products", priceUpdates.Count);
}
catch (Exception ex)
{
await transaction.RollbackAsync();
_logger.LogError(ex, "Failed to update product prices");
throw;
}
}
}
Performance Monitoring Dashboard
public class PerformanceDashboardController : ControllerBase
{
private readonly ApplicationDbContext _context;
private readonly TelemetryClient _telemetryClient;
private readonly IMemoryCache _memoryCache;
public PerformanceDashboardController(
ApplicationDbContext context,
TelemetryClient telemetryClient,
IMemoryCache memoryCache)
{
_context = context;
_telemetryClient = telemetryClient;
_memoryCache = memoryCache;
}
[HttpGet("api/dashboard/performance")]
public async Task<IActionResult> GetPerformanceMetrics()
{
var metrics = new PerformanceMetricsDto
{
Timestamp = DateTime.UtcNow,
// Application metrics
MemoryUsage = GC.GetTotalMemory(false),
ActiveRequests = GetActiveRequestCount(),
ThreadPoolStats = GetThreadPoolStats(),
// Database metrics
DatabaseMetrics = await GetDatabaseMetricsAsync(),
// Cache metrics
CacheMetrics = GetCacheMetrics(),
// System metrics
CpuUsage = await GetCpuUsageAsync(),
NetworkStats = GetNetworkStats()
};
// Track custom metric
_telemetryClient.TrackMetric("PerformanceDashboardQuery", 1);
return Ok(metrics);
}
[HttpGet("api/dashboard/slow-queries")]
public async Task<IActionResult> GetSlowQueries()
{
var slowQueries = await _context.QueryMetrics
.Where(q => q.Duration > TimeSpan.FromSeconds(1))
.OrderByDescending(q => q.Duration)
.Take(50)
.Select(q => new SlowQueryDto
{
Sql = q.Sql,
Duration = q.Duration,
Timestamp = q.Timestamp,
Parameters = q.Parameters
})
.ToListAsync();
return Ok(slowQueries);
}
[HttpPost("api/dashboard/clear-cache")]
public async Task<IActionResult> ClearCache()
{
// Clear memory cache
if (_memoryCache is MemoryCache memoryCache)
{
memoryCache.Compact(1.0); // Clear 100% of cache
}
// Track cache clearance
_telemetryClient.TrackEvent("CacheCleared");
return Ok(new { message = "Cache cleared successfully" });
}
private async Task<DatabaseMetricsDto> GetDatabaseMetricsAsync()
{
var connection = _context.Database.GetDbConnection();
return new DatabaseMetricsDto
{
ActiveConnections = await GetActiveConnectionsAsync(connection),
ConnectionPoolSize = await GetConnectionPoolSizeAsync(connection),
SlowQueryCount = await _context.QueryMetrics
.CountAsync(q => q.Duration > TimeSpan.FromSeconds(1))
};
}
private ThreadPoolStatsDto GetThreadPoolStats()
{
ThreadPool.GetAvailableThreads(out var workerThreads, out var completionPortThreads);
ThreadPool.GetMaxThreads(out var maxWorkerThreads, out var maxCompletionPortThreads);
return new ThreadPoolStatsDto
{
AvailableWorkerThreads = workerThreads,
AvailableCompletionPortThreads = completionPortThreads,
MaxWorkerThreads = maxWorkerThreads,
MaxCompletionPortThreads = maxCompletionPortThreads
};
}
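// Helpers such as GetActiveRequestCount, GetCacheMetrics, GetCpuUsageAsync, GetNetworkStats,
// GetActiveConnectionsAsync and GetConnectionPoolSizeAsync are omitted here for brevity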
}
11. Advanced Performance Patterns
Background Processing with Hangfire
public static class BackgroundJobConfiguration
{
public static void ConfigureBackgroundJobs(IServiceCollection services, IConfiguration configuration)
{
// Add Hangfire services
services.AddHangfire(config => config
.SetDataCompatibilityLevel(CompatibilityLevel.Version_170)
.UseSimpleAssemblyNameTypeSerializer()
.UseRecommendedSerializerSettings()
.UseSqlServerStorage(configuration.GetConnectionString("HangfireConnection")));
// Add the processing server as IHostedService
services.AddHangfireServer(options =>
{
options.WorkerCount = Environment.ProcessorCount * 2;
options.Queues = new[] { "default", "emails", "reports" };
});
}
}
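A typical Program.cs addition to expose the Hangfire dashboard (a sketch; restrict access before exposing it outside development):
app.UseHangfireDashboard("/hangfire");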
public class ProductBackgroundService
{
private readonly IBackgroundJobClient _backgroundJob;
private readonly ILogger<ProductBackgroundService> _logger;
public ProductBackgroundService(
IBackgroundJobClient backgroundJob,
ILogger<ProductBackgroundService> logger)
{
_backgroundJob = backgroundJob;
_logger = logger;
}
public string SchedulePriceUpdate(int productId, decimal newPrice, DateTime effectiveDate)
{
return _backgroundJob.Schedule<ProductPriceService>(
service => service.UpdateProductPriceAsync(productId, newPrice, effectiveDate),
effectiveDate);
}
public void QueueInventorySync(int supplierId)
{
_backgroundJob.Enqueue<InventorySyncService>(
service => service.SyncSupplierInventoryAsync(supplierId));
}
public void RecurringSalesReport()
{
RecurringJob.AddOrUpdate<SalesReportService>(
"daily-sales-report",
service => service.GenerateDailySalesReportAsync(),
Cron.Daily(2, 0)); // Run daily at 2 AM
}
}
public class ProductPriceService
{
private readonly ApplicationDbContext _context;
private readonly ICacheService _cacheService;
private readonly ILogger<ProductPriceService> _logger;
public ProductPriceService(
ApplicationDbContext context,
ICacheService cacheService,
ILogger<ProductPriceService> logger)
{
_context = context;
_cacheService = cacheService;
_logger = logger;
}
[AutomaticRetry(Attempts = 3, DelaysInSeconds = new[] { 30, 60, 120 })]
public async Task UpdateProductPriceAsync(int productId, decimal newPrice, DateTime effectiveDate)
{
using var transaction = await _context.Database.BeginTransactionAsync();
try
{
// Update price
await _context.Products
.Where(p => p.Id == productId)
.ExecuteUpdateAsync(p => p
.SetProperty(x => x.Price, newPrice)
.SetProperty(x => x.UpdatedAt, DateTime.UtcNow));
// Log price change
var priceChange = new PriceChangeHistory
{
ProductId = productId,
OldPrice = await _context.PriceChangeHistory
.Where(p => p.ProductId == productId)
.OrderByDescending(p => p.ChangedAt)
.Select(p => p.NewPrice)
.FirstOrDefaultAsync(),
NewPrice = newPrice,
ChangedAt = DateTime.UtcNow,
EffectiveFrom = effectiveDate
};
_context.PriceChangeHistory.Add(priceChange);
await _context.SaveChangesAsync();
// Invalidate cache
await _cacheService.RemoveAsync($"product_{productId}");
await _cacheService.RemoveAsync("products_list");
await transaction.CommitAsync();
_logger.LogInformation(
"Updated price for product {ProductId} to {NewPrice}", productId, newPrice);
}
catch (Exception ex)
{
await transaction.RollbackAsync();
_logger.LogError(ex, "Failed to update price for product {ProductId}", productId);
throw;
}
}
}
Real-Time Performance Monitoring
public class RealTimePerformanceMonitor : IHostedService, IDisposable
{
private readonly ILogger<RealTimePerformanceMonitor> _logger;
private readonly TelemetryClient _telemetryClient;
private readonly IServiceProvider _serviceProvider;
private Timer _timer;
public RealTimePerformanceMonitor(
ILogger<RealTimePerformanceMonitor> logger,
TelemetryClient telemetryClient,
IServiceProvider serviceProvider)
{
_logger = logger;
_telemetryClient = telemetryClient;
_serviceProvider = serviceProvider;
}
public Task StartAsync(CancellationToken cancellationToken)
{
_timer = new Timer(CollectMetrics, null, TimeSpan.Zero, TimeSpan.FromSeconds(30));
return Task.CompletedTask;
}
private async void CollectMetrics(object state)
{
try
{
using var scope = _serviceProvider.CreateScope();
var context = scope.ServiceProvider.GetRequiredService<ApplicationDbContext>();
var metrics = new
{
Timestamp = DateTime.UtcNow,
Memory = GC.GetTotalMemory(false),
DatabaseConnections = await GetActiveDatabaseConnections(context),
RequestRate = await GetCurrentRequestRate(),
ErrorRate = await GetCurrentErrorRate()
};
// Track custom metrics
_telemetryClient.TrackMetric("MemoryUsage", metrics.Memory);
_telemetryClient.TrackMetric("DatabaseConnections", metrics.DatabaseConnections);
_telemetryClient.TrackMetric("RequestRate", metrics.RequestRate);
_telemetryClient.TrackMetric("ErrorRate", metrics.ErrorRate);
// Log if metrics exceed thresholds
if (metrics.Memory > 500 * 1024 * 1024) // 500MB
{
_logger.LogWarning("High memory usage detected: {Memory} bytes", metrics.Memory);
}
if (metrics.DatabaseConnections > 100)
{
_logger.LogWarning("High database connections: {Connections}", metrics.DatabaseConnections);
}
}
catch (Exception ex)
{
_logger.LogError(ex, "Error collecting performance metrics");
}
}
public Task StopAsync(CancellationToken cancellationToken)
{
_timer?.Change(Timeout.Infinite, 0);
return Task.CompletedTask;
}
public void Dispose()
{
_timer?.Dispose();
}
private async Task<int> GetActiveDatabaseConnections(ApplicationDbContext context)
{
var connection = context.Database.GetDbConnection();
if (connection.State == ConnectionState.Closed)
{
await connection.OpenAsync();
}
using var command = connection.CreateCommand();
command.CommandText = "SELECT COUNT(*) FROM sys.dm_exec_sessions WHERE database_id = DB_ID()";
return Convert.ToInt32(await command.ExecuteScalarAsync());
}
private Task<double> GetCurrentRequestRate()
{
// Implementation depends on your metrics collection system
return Task.FromResult(0.0);
}
private Task<double> GetCurrentErrorRate()
{
// Implementation depends on your metrics collection system
return Task.FromResult(0.0);
}
}
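Because the monitor implements IHostedService, it is registered once in Program.cs:
builder.Services.AddHostedService<RealTimePerformanceMonitor>();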
12. Monitoring and Diagnostics
Health Checks with Custom Metrics
public static class HealthCheckConfiguration
{
public static void ConfigureHealthChecks(IServiceCollection services, IConfiguration configuration)
{
services.AddHealthChecks()
// Database health check
.AddSqlServer(
configuration.GetConnectionString("DefaultConnection"),
name: "database",
timeout: TimeSpan.FromSeconds(5),
tags: new[] { "ready", "live" })
// Redis health check
.AddRedis(
configuration.GetConnectionString("Redis"),
name: "redis",
tags: new[] { "ready", "live" })
// Custom health checks
.AddCheck<MemoryHealthCheck>("memory", tags: new[] { "live" })
.AddCheck<DatabasePerformanceHealthCheck>("database_performance", tags: new[] { "ready" })
.AddCheck<ExternalApiHealthCheck>("external_api", tags: new[] { "live" })
// Application specific health checks
.AddCheck<OrderProcessingHealthCheck>("order_processing", tags: new[] { "ready" });
// Health check UI (for development)
services.AddHealthChecksUI()
.AddInMemoryStorage();
}
}
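The checks above are exposed through endpoints in Program.cs; a minimal sketch that splits readiness and liveness probes by tag:
app.MapHealthChecks("/health/ready", new HealthCheckOptions
{
Predicate = check => check.Tags.Contains("ready")
});
app.MapHealthChecks("/health/live", new HealthCheckOptions
{
Predicate = check => check.Tags.Contains("live")
});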
public class MemoryHealthCheck : IHealthCheck
{
private readonly long _thresholdBytes;
public MemoryHealthCheck(long thresholdBytes = 500 * 1024 * 1024) // 500MB
{
_thresholdBytes = thresholdBytes;
}
public Task<HealthCheckResult> CheckHealthAsync(
HealthCheckContext context,
CancellationToken cancellationToken = default)
{
var memory = GC.GetTotalMemory(false);
var status = memory < _thresholdBytes
? HealthStatus.Healthy
: HealthStatus.Degraded;
var data = new Dictionary<string, object>
{
["MemoryBytes"] = memory,
["MemoryMB"] = memory / 1024 / 1024,
["ThresholdBytes"] = _thresholdBytes,
["Gen0Collections"] = GC.CollectionCount(0),
["Gen1Collections"] = GC.CollectionCount(1),
["Gen2Collections"] = GC.CollectionCount(2)
};
var result = status == HealthStatus.Healthy
? HealthCheckResult.Healthy($"Memory usage is normal: {memory / 1024 / 1024}MB", data)
: HealthCheckResult.Degraded($"High memory usage: {memory / 1024 / 1024}MB", data);
return Task.FromResult(result);
}
}
public class DatabasePerformanceHealthCheck : IHealthCheck
{
private readonly ApplicationDbContext _context;
private readonly ILogger<DatabasePerformanceHealthCheck> _logger;
public DatabasePerformanceHealthCheck(
ApplicationDbContext context,
ILogger<DatabasePerformanceHealthCheck> logger)
{
_context = context;
_logger = logger;
}
public async Task<HealthCheckResult> CheckHealthAsync(
HealthCheckContext context,
CancellationToken cancellationToken = default)
{
try
{
var stopwatch = Stopwatch.StartNew();
// Test query performance
var result = await _context.Products
.OrderBy(p => p.Id)
.Take(1)
.AsNoTracking()
.FirstOrDefaultAsync(cancellationToken);
stopwatch.Stop();
var data = new Dictionary<string, object>
{
["QueryTimeMs"] = stopwatch.ElapsedMilliseconds,
["Timestamp"] = DateTime.UtcNow
};
if (stopwatch.ElapsedMilliseconds < 100)
{
return HealthCheckResult.Healthy(
$"Database response time normal: {stopwatch.ElapsedMilliseconds}ms", data);
}
else if (stopwatch.ElapsedMilliseconds < 500)
{
return HealthCheckResult.Degraded(
$"Database response time slow: {stopwatch.ElapsedMilliseconds}ms", data);
}
else
{
return HealthCheckResult.Unhealthy(
$"Database response time very slow: {stopwatch.ElapsedMilliseconds}ms", data);
}
}
catch (Exception ex)
{
_logger.LogError(ex, "Database health check failed");
return HealthCheckResult.Unhealthy("Database health check failed", ex);
}
}
}
Structured Logging with Serilog
public static class LoggingConfiguration
{
public static void ConfigureLogging(WebApplicationBuilder builder)
{
builder.Host.UseSerilog((context, services, configuration) =>
{
configuration
.ReadFrom.Configuration(context.Configuration)
.ReadFrom.Services(services)
.Enrich.FromLogContext()
.Enrich.WithMachineName()
.Enrich.WithThreadId()
.Enrich.WithProperty("Application", "ECommerceApp")
.WriteTo.Console(
outputTemplate: "[{Timestamp:HH:mm:ss} {Level:u3}] {Message:lj} {Properties:j}{NewLine}{Exception}")
.WriteTo.File(
"logs/app-.log",
rollingInterval: RollingInterval.Day,
outputTemplate: "{Timestamp:yyyy-MM-dd HH:mm:ss.fff zzz} [{Level:u3}] {Message:lj} {Properties:j}{NewLine}{Exception}")
.WriteTo.ApplicationInsights(
services.GetRequiredService<TelemetryClient>(),
TelemetryConverter.Traces);
});
}
}
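Serilog can also emit one structured completion event per request; a one-line addition in Program.cs (from the Serilog.AspNetCore package):
app.UseSerilogRequestLogging();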
// Performance logging middleware
public class PerformanceLoggingMiddleware
{
private readonly RequestDelegate _next;
private readonly ILogger<PerformanceLoggingMiddleware> _logger;
public PerformanceLoggingMiddleware(
RequestDelegate next,
ILogger<PerformanceLoggingMiddleware> logger)
{
_next = next;
_logger = logger;
}
public async Task InvokeAsync(HttpContext context)
{
var stopwatch = Stopwatch.StartNew();
var startMemory = GC.GetTotalMemory(false);
// Add correlation ID for request tracing
context.Response.Headers["X-Correlation-ID"] = context.TraceIdentifier;
using (LogContext.PushProperty("CorrelationId", context.TraceIdentifier))
using (LogContext.PushProperty("RequestPath", context.Request.Path))
using (LogContext.PushProperty("RequestMethod", context.Request.Method))
{
try
{
await _next(context);
stopwatch.Stop();
var endMemory = GC.GetTotalMemory(false);
LogRequestPerformance(
context,
stopwatch.ElapsedMilliseconds,
endMemory - startMemory,
context.Response.StatusCode);
}
catch (Exception ex)
{
stopwatch.Stop();
LogRequestError(context, stopwatch.ElapsedMilliseconds, ex);
throw;
}
}
}
private void LogRequestPerformance(
HttpContext context,
long elapsedMs,
long memoryUsed,
int statusCode)
{
var logData = new
{
Path = context.Request.Path,
Method = context.Request.Method,
StatusCode = statusCode,
DurationMs = elapsedMs,
MemoryUsed = memoryUsed,
UserAgent = context.Request.Headers.UserAgent.ToString(),
ClientIP = context.Connection.RemoteIpAddress?.ToString()
};
if (elapsedMs > 1000)
{
_logger.LogWarning(
"Slow request completed in {ElapsedMs}ms: {Method} {Path}",
elapsedMs, context.Request.Method, context.Request.Path);
}
else
{
_logger.LogInformation(
"Request completed in {ElapsedMs}ms: {Method} {Path} - Status: {StatusCode}",
elapsedMs, context.Request.Method, context.Request.Path, statusCode);
}
}
private void LogRequestError(HttpContext context, long elapsedMs, Exception ex)
{
_logger.LogError(
ex,
"Request failed after {ElapsedMs}ms: {Method} {Path}",
elapsedMs, context.Request.Method, context.Request.Path);
}
}
13. Conclusion
Performance optimization in ASP.NET Core is a continuous journey that requires careful planning, monitoring, and iteration. Throughout this comprehensive guide, we've explored numerous techniques and patterns to turbocharge your applications:
Key Takeaways
Async/Await Mastery: Proper use of asynchronous programming is fundamental for I/O-bound operations
Database Optimization: Efficient querying, connection management, and EF Core best practices are crucial
Caching Strategies: Implement multi-level caching with proper invalidation policies
Memory Management: Use object pooling and efficient data structures to reduce pressure
Monitoring and Diagnostics: Comprehensive logging and health checks provide visibility
Background Processing: Offload non-critical work to background services
Compression and Bandwidth: Reduce payload sizes with appropriate compression techniques
Continuous Improvement
Performance optimization is not a one-time task but an ongoing process. Implement these practices:
Regular performance testing and benchmarking (see the micro-benchmark sketch after this list)
Continuous monitoring with alerting
Code reviews focused on performance
Capacity planning and scaling strategies
Staying updated with .NET performance improvements
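For the benchmarking item above, a minimal micro-benchmark sketch assuming the BenchmarkDotNet NuGet package (the payload type and names are illustrative):
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;
using System.Text.Json;
public record SamplePayload(int Id, string Name, decimal Price);
[MemoryDiagnoser]
public class SerializationBenchmarks
{
// Measures speed and allocations of serializing a small payload
private readonly SamplePayload _payload = new(1, "Sample", 9.99m);
[Benchmark]
public string Serialize() => JsonSerializer.Serialize(_payload);
}
public class Program
{
// Run with a Release build: dotnet run -c Release
public static void Main() => BenchmarkRunner.Run<SerializationBenchmarks>();
}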
Final Performance Checklist
public class PerformanceChecklist
{
public async Task<bool> ValidateApplicationPerformance()
{
var checks = new List<Func<Task<bool>>>
{
CheckDatabaseConnectionPerformance,
CheckMemoryUsage,
CheckResponseTimes,
CheckCacheEffectiveness,
CheckBackgroundProcessing,
CheckErrorRates
};
var results = await Task.WhenAll(checks.Select(check => check()));
return results.All(result => result);
}
private async Task<bool> CheckDatabaseConnectionPerformance()
{
// Implement database performance checks
return await Task.FromResult(true);
}
private async Task<bool> CheckMemoryUsage()
{
var memory = GC.GetTotalMemory(false);
return await Task.FromResult(memory < 500 * 1024 * 1024); // Under 500MB
}
private async Task<bool> CheckResponseTimes()
{
// Implement response time checks
return await Task.FromResult(true);
}
private async Task<bool> CheckCacheEffectiveness()
{
// Implement cache hit ratio checks
return await Task.FromResult(true);
}
private async Task<bool> CheckBackgroundProcessing()
{
// Implement background job health checks
return await Task.FromResult(true);
}
private async Task<bool> CheckErrorRates()
{
// Implement error rate monitoring
return await Task.FromResult(true);
}
}
Remember that every application is unique, and the most effective performance optimizations will depend on your specific use cases, workload patterns, and infrastructure. Use profiling tools to identify your actual bottlenecks rather than optimizing based on assumptions.
By implementing these performance hacks and maintaining a performance-first mindset, you'll create ASP.NET Core applications that are not only fast and responsive but also scalable and maintainable in the long term.