I have been browsing around for a good alternative to Dapper. That is when I came across RepoDb. RepoDb is a hybrid ORM framework that sits between Dapper and Entity Framework Core.
I absolutely love the simplicity of Dapper, but at the same time I sometimes miss the auto-generated SQL that comes with Entity Framework Core. Because let us agree, there are times when writing a simple SQL statement becomes mundane and unnecessary.
This is where RepoDb shines the most. It brings all the simplicity that Dapper provides as a micro ORM, but at the same time it encapsulates some of the complex operations the way Entity Framework Core does.
In this blog post, I will explore the basic features of RepoDb. And as I explore, I will compare and contrast it with Dapper and Entity Framework Core where it makes sense.
Creating a RepoDb-based Application
Firstly, I will create a new ASP.NET Core Web API application. To do that, I will open up Visual Studio 2019. Once Visual Studio opens up, I will select the menu File -> New -> Project. This will open the Create a new project popup window.
Secondly, in the Create a new project popup window, I will select ASP.NET Core Web Application from the project templates and click on the Next button.
Thirdly, on the next page, I will provide the name of the application as RepoDb.Demo and click on the Create button.
Finally, on the last page, I will select the API template option. I will keep the other values at their defaults (ASP.NET Core 3.1) and click on the Create button.
Adding RepoDb NuGet package
Once the project is ready, I will add the RepoDb NuGet package to the newly created project. To do so, I will right-click on the RepoDb.Demo project and select the option Manage NuGet Packages…
This will open up the NuGet package manager window. In the Browse tab of the NuGet package manager window, I will search for the RepoDb NuGet package.
I will install version 1.11.3 of the RepoDb NuGet package into the project.
Next, I will install version 1.0.10 of the RepoDb.SqlServer NuGet package. This NuGet package has a dependency on the System.Data.SqlClient, Microsoft.Data.SqlClient, and RepoDb NuGet packages.
Creating a Database
Since we will be accessing data using RepoDb, I will create a new database in SQL Server. I will name the database RepoDbDemo.
Once the database RepoDbDemo is created, I will create tables inside the database.
For the demo application, let us build a couple of tables from an inventory management solution. The first table is a warehouse listing table, and the second one is a product listing table.
Firstly, I will create a table named Warehouse. And for simplicity, I will keep only three columns in this table: the Id of the warehouse, the name of the warehouse, and the address of the warehouse.
Secondly, I will create another table named Product. And for simplicity, I will keep only three columns in this table as well: the Id of the product, the name of the product, and the count of the product available in inventory.
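The post does not show the actual table definitions, so below is a minimal sketch of what they could look like. The column names follow the model classes used later in this post, but the exact column types, the identity setting on Id, and the DemoSchema helper are my own assumptions; the DDL is executed with plain ADO.NET so the schema stays explicit.
using Microsoft.Data.SqlClient;
namespace RepoDb.Demo
{
    // Hypothetical helper that creates the two demo tables described above.
    // Column types and the identity Id columns are assumptions for this demo.
    public static class DemoSchema
    {
        private const string Ddl = @"
CREATE TABLE dbo.Warehouse
(
    Id      INT IDENTITY(1,1) PRIMARY KEY,
    Name    NVARCHAR(256) NOT NULL,
    Address NVARCHAR(512) NULL
);
CREATE TABLE dbo.Product
(
    Id      INT IDENTITY(1,1) PRIMARY KEY,
    Name    NVARCHAR(256) NOT NULL,
    [Count] INT NOT NULL
);";
        public static void Create(string connectionString)
        {
            using var connection = new SqlConnection(connectionString);
            connection.Open();
            using var command = new SqlCommand(Ddl, connection);
            command.ExecuteNonQuery();
        }
    }
}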
Accessing database with RepoDb
Now that the database as well as the ASP.NET Core Web application is ready, it is time to start using RepoDb to interact with the database.
As I mentioned at the beginning of this article, though RepoDb is a micro ORM, it still has a lot of the advanced capabilities of Entity Framework Core.
Now, just like Dapper, RepoDb operations are also implemented as extension methods of the IDbConnection interface. And as long as the connection is open, we can perform any operation on the database through RepoDb.
So, first, let me create a new Warehouse using the auto-generated SQL capability provided by RepoDb.
Ways of creating a record
There are a few ways to insert data into a database table using RepoDb.
- Firstly, creating a class deriving from the BaseRepository class
- Secondly, using the Insert extension method of IDbConnection and passing an object
- Finally, using the ExecuteNonQuery extension method of IDbConnection along with an inline query or a stored procedure
Using the BaseRepository class
To demonstrate how to use BaseRepository, I will create a new class WarehouseRepository. This new class will derive from the BaseRepository class. Along with the WarehouseRepository class, I will also create the Warehouse model class, which will have properties mapping to the columns of the Warehouse database table.
For the WarehouseRepository class, I will pass the Warehouse model and SqlConnection types to the generic BaseRepository. This will inform the BaseRepository that I will be using a SqlConnection for connecting to the database, and that Warehouse is my data model and database table.
Finally, inside the WarehouseRepository, I will define a constructor with a connection string parameter so that I can call the base constructor with the connection string.
NOTE: BaseRepository comes with multiple overloads of the constructor. For most use cases, we will end up using the constructor that takes the connection string.
namespace RepoDb.Demo
{
public class Warehouse
{
public int Id { get; set; }
public string Name { get; set; }
public string Address { get; set; }
}
}
using Microsoft.Data.SqlClient;
namespace RepoDb.Demo
{
public class WarehouseRepository : BaseRepository<Warehouse, SqlConnection>
{
public WarehouseRepository(string connectionString) : base(connectionString)
{
}
}
}
Controller for creating a Warehouse
After the repository class is ready, I will create a new controller, which will be responsible for exposing the warehouse data.
To achieve this, I will right-click on the Controllers folder and select Add -> Controller… In the new controller window, I will select the API Controller - Empty option to create a new empty API controller. And finally, I will name the controller WarehouseController.
Inside the controller, I will take WarehouseRepository as a dependency in the constructor. And then I will create a Post method, which will use the WarehouseRepository instance to create a new warehouse record from the incoming request object.
using System;
using Microsoft.AspNetCore.Mvc;
namespace RepoDb.Demo.Controllers
{
[Route("api/[controller]")]
[ApiController]
public class WarehouseController : ControllerBase
{
private readonly WarehouseRepository warehouseRepository;
public WarehouseController(WarehouseRepository warehouseRepository)
{
this.warehouseRepository = warehouseRepository;
}
[HttpPost]
public IActionResult Post([FromBody] Warehouse warehouse)
{
try
{
warehouseRepository.Insert(warehouse);
return Ok();
}
catch (Exception exc)
{
return Problem(exc.Message);
}
}
}
}
Configure Startup
Finally, I will configure the Startup class to wire up everything in the dependency injection container.
But before I do that, I will add the connection string for the RepoDbDemo database in the appsettings.json file.
{
"Logging": {
"LogLevel": {
"Default": "Trace",
"Microsoft": "Warning",
"Microsoft.Hosting.Lifetime": "Information"
}
},
"AllowedHosts": "*",
"ConnectionString": "Persist Security Info = False; Integrated Security = true; Initial Catalog = RepoDbDemo; server = .\\SQLEXPRESS;"
}
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
namespace RepoDb.Demo
{
public class Startup
{
public Startup(IConfiguration configuration)
{
Configuration = configuration;
}
public IConfiguration Configuration { get; }
// This method gets called by the runtime. Use this method to add services to the container.
public void ConfigureServices(IServiceCollection services)
{
services.AddControllers();
SqlServerBootstrap.Initialize();
var connection = Configuration.GetValue<string>("ConnectionString");
services.AddSingleton(new WarehouseRepository(connection));
}
// This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
if (env.IsDevelopment())
{
app.UseDeveloperExceptionPage();
}
app.UseHttpsRedirection();
app.UseRouting();
app.UseAuthorization();
app.UseEndpoints(endpoints =>
{
endpoints.MapControllers();
});
}
}
}
In the Startup class, I will update the ConfigureServices method to add a WarehouseRepository instance to the dependency injection container. I will also call the Initialize static method on the RepoDb.SqlServerBootstrap class to bootstrap SQL Server support.
Running the application
Now, I will run the application. Once I run the application, I am going to use Postman to make a POST request to the Warehouse endpoint. This will create a new record in the Warehouse table.
POST /api/warehouse HTTP/1.1
Host: localhost:5091
Content-Type: application/json
{
"name":"Warehouse 1",
"address": "123 State Street"
}
Now, I will check the Warehouse table in the database, and I can see the record was added as expected.
Using the IDbConnection extension method
The next way of creating a record in the database using RepoDb is to use the extension methods on IDbConnection. And this is very similar to how Dapper works.
But the main advantage here is that we can either use an inline query or just pass the object.
Let us first write a method that passes an object to the Insert extension method. To demonstrate that, I will create a new class WarehouseObjectRepo. And in this class, I will have a method Add, which will call the Insert extension method on IDbConnection to insert a new Warehouse.
This Insert method will take the Warehouse object as a parameter.
namespace RepoDb.Demo
{
public interface IWarehouseObjectRepo
{
void Add(Warehouse warehouse);
}
}
using Microsoft.Data.SqlClient;
namespace RepoDb.Demo
{
public class WarehouseObjectRepo : IWarehouseObjectRepo
{
private readonly string connectionString;
public WarehouseObjectRepo(string connectionString)
{
this.connectionString = connectionString;
}
public void Add(Warehouse warehouse)
{
using var connection = new SqlConnection(connectionString);
connection.Insert(warehouse);
}
}
}
Next, I will update the Startup class to configure WarehouseObjectRepo inside the ConfigureServices method.
services.AddSingleton<IWarehouseObjectRepo>(new WarehouseObjectRepo(connection));
Finally, I will change the controller class to inject the new IWarehouseObjectRepo interface. And I will use this interface for adding a new warehouse to the database.
using Microsoft.AspNetCore.Mvc;
namespace RepoDb.Demo.Controllers
{
[Route("api/[controller]")]
[ApiController]
public class WarehouseController : ControllerBase
{
private readonly WarehouseRepository warehouseRepository;
private readonly IWarehouseObjectRepo warehouseObjectRepo;
public WarehouseController(WarehouseRepository warehouseRepository,
IWarehouseObjectRepo warehouseObjectRepo)
{
this.warehouseRepository = warehouseRepository;
this.warehouseObjectRepo = warehouseObjectRepo;
}
[HttpPost]
public IActionResult Post([FromBody] Warehouse warehouse)
{
warehouseObjectRepo.Add(warehouse);
return Ok();
}
}
}
Now we can run the application and post a new warehouse using Postman to insert it into the database using this mechanism.
Using an Inline query or Stored Procedure with IDbConnection
Using an inline query and calling a stored procedure work the same way in RepoDb. For executing both, we will use the ExecuteNonQuery extension method. And for the parameters of the stored procedure or the inline query, we will use anonymous objects.
If we want to add a warehouse to the database using an inline query, we will use the following code to achieve that.
using var connection = new SqlConnection(connectionString);
connection.ExecuteNonQuery("INSERT INTO Warehouse (Name, Address) VALUES (@Name, @Address)", new { warehouse.Name, warehouse.Address });
And if we want to add a warehouse to the database using a stored procedure, we will use the following code to achieve that.
using var connection = new SqlConnection(connectionString);
connection.ExecuteNonQuery("EXEC dbo.AddWarehouse @Name, @Address",
new { warehouse.Name, warehouse.Address });
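Note that the dbo.AddWarehouse stored procedure is not part of the database we created earlier, so it has to exist before this call will work. Below is a hedged sketch of a minimal definition, created the same way as the demo tables above; the procedure body and the DemoProcedures helper are my own assumptions.
using Microsoft.Data.SqlClient;
namespace RepoDb.Demo
{
    // Hypothetical one-time setup for the stored procedure assumed by the example above.
    public static class DemoProcedures
    {
        private const string Ddl = @"
CREATE PROCEDURE dbo.AddWarehouse
    @Name    NVARCHAR(256),
    @Address NVARCHAR(512)
AS
BEGIN
    INSERT INTO dbo.Warehouse (Name, Address) VALUES (@Name, @Address);
END";
        public static void Create(string connectionString)
        {
            using var connection = new SqlConnection(connectionString);
            connection.Open();
            using var command = new SqlCommand(Ddl, connection);
            command.ExecuteNonQuery();
        }
    }
}
As far as I can tell, RepoDb's ExecuteNonQuery also accepts a commandType parameter, so the EXEC text above could alternatively be replaced by the procedure name together with CommandType.StoredProcedure.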
Updating and deleting records
Updating and deleting data using RepoDb work exactly the same way as inserting does. Meaning we can either use a BaseRepository-derived class, or we can use the extension methods on IDbConnection.
For the IDbConnection extension methods, we can either use ExecuteNonQuery for an inline query or a stored procedure. Or we can use the Update method for updating a record and the Delete method for deleting a record.
Update with Update extension method
I will show how update works with the Update extension method in the code below.
namespace RepoDb.Demo
{
public interface IWareHouseUpdator
{
void Update(Warehouse warehouse);
}
}
using Microsoft.Data.SqlClient;
using RepoDb.Interfaces;
namespace RepoDb.Demo
{
public class WareHouseUpdator : IWareHouseUpdator
{
private readonly string connectionString;
public WareHouseUpdator(string connectionString)
{
this.connectionString = connectionString;
}
public void Update(Warehouse warehouse)
{
using var connection = new SqlConnection(connectionString);
connection.Update(warehouse);
}
}
}
Delete with Delete extension method
I will show how delete works with the Delete extension method in the code below.
namespace RepoDb.Demo
{
public interface IWareHouseDeletor
{
void Delete(int id);
}
}
using Microsoft.Data.SqlClient;
namespace RepoDb.Demo
{
public class WareHouseDeletor : IWareHouseDeletor
{
private readonly string connectionString;
public WareHouseDeletor(string connectionString)
{
this.connectionString = connectionString;
}
public void Delete(int id)
{
using var connection = new SqlConnection(connectionString);
connection.Delete<Warehouse>(id);
}
}
}
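The post does not show wiring these two classes into the application, so here is a hedged sketch of how they could be registered and exposed over HTTP. The WarehouseMaintenanceController name, the route and action shapes, and the registration lines in the comments are my own; the registration style simply mirrors what we did for WarehouseObjectRepo.
using Microsoft.AspNetCore.Mvc;
namespace RepoDb.Demo.Controllers
{
    // Hypothetical controller exposing update and delete. In Startup.ConfigureServices,
    // the dependencies could be registered like the other repositories, for example:
    //   services.AddSingleton<IWareHouseUpdator>(new WareHouseUpdator(connection));
    //   services.AddSingleton<IWareHouseDeletor>(new WareHouseDeletor(connection));
    [Route("api/[controller]")]
    [ApiController]
    public class WarehouseMaintenanceController : ControllerBase
    {
        private readonly IWareHouseUpdator wareHouseUpdator;
        private readonly IWareHouseDeletor wareHouseDeletor;
        public WarehouseMaintenanceController(IWareHouseUpdator wareHouseUpdator,
            IWareHouseDeletor wareHouseDeletor)
        {
            this.wareHouseUpdator = wareHouseUpdator;
            this.wareHouseDeletor = wareHouseDeletor;
        }
        [HttpPut]
        public IActionResult Put([FromBody] Warehouse warehouse)
        {
            // Uses connection.Update(warehouse) internally (see WareHouseUpdator above).
            wareHouseUpdator.Update(warehouse);
            return Ok();
        }
        [HttpDelete("{id}")]
        public IActionResult Delete(int id)
        {
            // Uses connection.Delete<Warehouse>(id) internally (see WareHouseDeletor above).
            wareHouseDeletor.Delete(id);
            return Ok();
        }
    }
}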
Reading records using RepoDb
Reading records from the database using RepoDb can also be done in a couple of ways.
Firstly, using the BaseRepository-derived class. In this case, we can call Query or QueryAll to get the data (a quick sketch of this approach follows the IDbConnection example below).
Secondly, using the extension methods on the IDbConnection object. I am going to demonstrate how to use the IDbConnection extension methods in RepoDb for getting data from the database.
namespace RepoDb.Demo
{
public interface IWarehouseProvider
{
Warehouse[] Get();
Warehouse Get(int id);
}
}
using Microsoft.Data.SqlClient;
using RepoDb.Interfaces;
using System.Linq;
namespace RepoDb.Demo
{
public class WarehouseProvider : IWarehouseProvider
{
private readonly string connectionString;
public WarehouseProvider(string connectionString)
{
this.connectionString = connectionString;
}
public Warehouse[] Get()
{
using var connection = new SqlConnection(connectionString);
return connection.QueryAll<Warehouse>().ToArray();
}
public Warehouse Get(int id)
{
using var connection = new SqlConnection(connectionString);
return connection.Query<Warehouse>(w => w.Id == id).FirstOrDefault();
}
}
}
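For completeness, here is a hedged sketch of the first approach as well, reading data through the BaseRepository-derived WarehouseRepository defined earlier. The WarehouseQueries class and its GetAll/GetById method names are my own; BaseRepository exposes Query and QueryAll directly, so there is no connection handling to write.
using System.Linq;
namespace RepoDb.Demo
{
    // Hypothetical read-side wrapper around the BaseRepository-derived repository.
    public class WarehouseQueries
    {
        private readonly WarehouseRepository warehouseRepository;
        public WarehouseQueries(WarehouseRepository warehouseRepository)
        {
            this.warehouseRepository = warehouseRepository;
        }
        public Warehouse[] GetAll()
        {
            // QueryAll generates the SELECT statement for the Warehouse table.
            return warehouseRepository.QueryAll().ToArray();
        }
        public Warehouse GetById(int id)
        {
            // Query translates the expression into a WHERE clause on Id.
            return warehouseRepository.Query(w => w.Id == id).FirstOrDefault();
        }
    }
}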
Caching
RepoDb provides the ability to cache frequently accessed data to avoid unnecessary database round trips.
RepoDb comes with an out-of-the-box implementation of in-memory caching. But it also provides an interface to extend the caching mechanism to our own cache store.
Using the out-of-the-box cache provider
To use the RepoDb-provided cache mechanism, the class needs to derive from BaseRepository.
We already defined a class WarehouseRepository which derives from BaseRepository. If we want to cache data while retrieving it from the database using WarehouseRepository, all we have to do is provide a cache key during data retrieval.
Below I will demonstrate how it works.
using Microsoft.AspNetCore.Mvc;
namespace RepoDb.Demo.Controllers
{
[Route("api/[controller]")]
[ApiController]
public class WarehouseController : ControllerBase
{
private readonly WarehouseRepository warehouseRepository;
public WarehouseController(WarehouseRepository warehouseRepository)
{
this.warehouseRepository = warehouseRepository;
}
[HttpGet]
public IActionResult Get()
{
return Ok(warehouseRepository.QueryAll(cacheKey: "all"));
}
}
}
As you can see in the above code, to enable caching all we did is set a cache key for the QueryAll method. And now all subsequent calls will go to the cache and not the database until the cache expires.
One disadvantage of this caching mechanism is that we are not in control of where the data is cached or what the cache expiry duration is. Plus, we cannot use it with the IDbConnection extension methods.
Implementing ICache interface
If we create our own implementation of caching using the ICache interface, we can manage the cache store and the cache duration. And above all, it will work with the IDbConnection extension methods.
To implement our own cache provider, I will create a new class MemoryCache which will implement the ICache interface of RepoDb.
For the demonstration of this feature, I will only implement the Add overload that takes a key and a value, and the Get method.
using RepoDb.Interfaces;
using System;
using System.Collections;
using System.Collections.Generic;
namespace RepoDb.Demo
{
public class MemoryCache : ICache
{
private IDictionary data;
public void Add<T>(string key, T value, int expiration = 180, bool throwException = true)
{
if (data == null)
data = new Dictionary<string, CacheItem<T>>();
var tempData = (Dictionary<string, CacheItem<T>>)data;
if (tempData.ContainsKey(key) && throwException)
throw new ArgumentException("Error");
data.Add(key, new CacheItem<T>(key, value, expiration));
}
public void Add<T>(CacheItem<T> item, bool throwException = true)
{
throw new NotImplementedException();
}
public void Clear()
{
throw new NotImplementedException();
}
public bool Contains(string key)
{
throw new NotImplementedException();
}
public CacheItem<T> Get<T>(string key, bool throwException = true)
{
if (data == null)
return null;
var tempData = (Dictionary<string, CacheItem<T>>)data;
if (!tempData.ContainsKey(key) && throwException)
throw new ArgumentException("Error");
return (CacheItem<T>)data[key];
}
public IEnumerator GetEnumerator()
{
throw new NotImplementedException();
}
public void Remove(string key, bool throwException = true)
{
throw new NotImplementedException();
}
}
}
Once this is implemented, I will go and update the WarehouseProvider to use the ICache implementation to cache data.
using Microsoft.Data.SqlClient;
using RepoDb.Interfaces;
using System.Linq;
namespace RepoDb.Demo
{
public class WarehouseProvider : IWarehouseProvider
{
private readonly string connectionString;
private readonly ICache cache;
public WarehouseProvider(string connectionString, ICache cache)
{
this.connectionString = connectionString;
this.cache = cache;
}
public Warehouse[] Get()
{
using var connection = new SqlConnection(connectionString);
return connection.QueryAll<Warehouse>(cacheKey: "customAll", cache: cache).ToArray();
}
public Warehouse Get(int id)
{
using var connection = new SqlConnection(connectionString);
return connection.Query<Warehouse>(w => w.Id == id).FirstOrDefault();
}
}
}
And I will also update the Startup class to register the ICache interface.
services.AddSingleton<ICache, MemoryCache>();
Once this is complete, we can run the application and take advantage of caching. And because the cache store is now our own implementation, we could just as well keep the cached data in a distributed cache like Redis.
The advantage of this design is that the caller of the IWarehouseProvider does not have to worry about managing the cache. It is completely transparent to the caller.
Tracing with RepoDb
Tracing is very important, especially when we use the extension methods that take an object directly. In those cases, we can log the exact SQL queries generated by the RepoDb framework and see their execution times.
Implementing tracing is a two-step process. As a first step, we need to create a class that implements the ITrace interface.
In the second step, we provide the ITrace instance to the extension method that we want to trace. The advantage of this way of configuration is that it gives more granular control. And at the same time, we can centrally manage whether we want to use tracing or not.
The ITrace interface comes with a lot of methods for tracing. For this demonstration, I will provide the implementation of the AfterUpdate method alone. I will create a new class CustomTrace which will implement the ITrace interface.
using RepoDb.Interfaces;
using System;
namespace RepoDb.Demo
{
public class CustomTrace : ITrace
{
public void AfterAverage(TraceLog log)
{
}
public void AfterAverageAll(TraceLog log)
{
}
public void AfterBatchQuery(TraceLog log)
{
}
public void AfterCount(TraceLog log)
{
}
public void AfterCountAll(TraceLog log)
{
}
public void AfterDelete(TraceLog log)
{
}
public void AfterDeleteAll(TraceLog log)
{
}
public void AfterExecuteNonQuery(TraceLog log)
{
}
public void AfterExecuteQuery(TraceLog log)
{
}
public void AfterExecuteReader(TraceLog log)
{
}
public void AfterExecuteScalar(TraceLog log)
{
}
public void AfterExists(TraceLog log)
{
}
public void AfterInsert(TraceLog log)
{
}
public void AfterInsertAll(TraceLog log)
{
}
public void AfterMax(TraceLog log)
{
}
public void AfterMaxAll(TraceLog log)
{
}
public void AfterMerge(TraceLog log)
{
}
public void AfterMergeAll(TraceLog log)
{
}
public void AfterMin(TraceLog log)
{
}
public void AfterMinAll(TraceLog log)
{
}
public void AfterQuery(TraceLog log)
{
}
public void AfterQueryAll(TraceLog log)
{
}
public void AfterQueryMultiple(TraceLog log)
{
}
public void AfterSum(TraceLog log)
{
}
public void AfterSumAll(TraceLog log)
{
}
public void AfterTruncate(TraceLog log)
{
}
public void AfterUpdate(TraceLog log)
{
Console.WriteLine($"{log.ExecutionTime}, {log.Parameter}, {log.Statement}");
}
public void AfterUpdateAll(TraceLog log)
{
}
public void BeforeAverage(CancellableTraceLog log)
{
}
public void BeforeAverageAll(CancellableTraceLog log)
{
}
public void BeforeBatchQuery(CancellableTraceLog log)
{
}
public void BeforeCount(CancellableTraceLog log)
{
}
public void BeforeCountAll(CancellableTraceLog log)
{
}
public void BeforeDelete(CancellableTraceLog log)
{
}
public void BeforeDeleteAll(CancellableTraceLog log)
{
}
public void BeforeExecuteNonQuery(CancellableTraceLog log)
{
}
public void BeforeExecuteQuery(CancellableTraceLog log)
{
}
public void BeforeExecuteReader(CancellableTraceLog log)
{
}
public void BeforeExecuteScalar(CancellableTraceLog log)
{
}
public void BeforeExists(CancellableTraceLog log)
{
}
public void BeforeInsert(CancellableTraceLog log)
{
}
public void BeforeInsertAll(CancellableTraceLog log)
{
}
public void BeforeMax(CancellableTraceLog log)
{
}
public void BeforeMaxAll(CancellableTraceLog log)
{
}
public void BeforeMerge(CancellableTraceLog log)
{
}
public void BeforeMergeAll(CancellableTraceLog log)
{
}
public void BeforeMin(CancellableTraceLog log)
{
}
public void BeforeMinAll(CancellableTraceLog log)
{
}
public void BeforeQuery(CancellableTraceLog log)
{
}
public void BeforeQueryAll(CancellableTraceLog log)
{
}
public void BeforeQueryMultiple(CancellableTraceLog log)
{
}
public void BeforeSum(CancellableTraceLog log)
{
}
public void BeforeSumAll(CancellableTraceLog log)
{
}
public void BeforeTruncate(CancellableTraceLog log)
{
}
public void BeforeUpdate(CancellableTraceLog log)
{
}
public void BeforeUpdateAll(CancellableTraceLog log)
{
}
}
}
Once the CustomTrace class is ready, I will go ahead and update the Startup class to register the CustomTrace class in the dependency injection container.
services.AddSingleton<ITrace, CustomTrace>();
Finally, I will update the WareHouseUpdator class to inject ITrace and use it.
using Microsoft.Data.SqlClient;
using RepoDb.Interfaces;
namespace RepoDb.Demo
{
public class WareHouseUpdator : IWareHouseUpdator
{
private readonly string connectionString;
private readonly ITrace trace;
public WareHouseUpdator(string connectionString,
ITrace trace)
{
this.connectionString = connectionString;
this.trace = trace;
}
public void Update(Warehouse warehouse)
{
using var connection = new SqlConnection(connectionString);
connection.Update(warehouse, trace: trace);
}
}
}
Now if I update a warehouse using this class, I will see all the information logged in the console.
Conclusion
So far, from my experience, I can recommend RepoDb as a fantastic alternative to both Dapper and Entity Framework Core. It provides the simplicity of Dapper, and at the same time it provides some of the rich features of Entity Framework Core.
I have done a couple of YouTube videos to walk through the implementation of RepoDb. Here are the links to those videos:
- RepoDb Hybrid ORM (An Introduction in ASP.Net Core Application)
- Cache and Trace in RepoDb (.Net Core)
The source code for this blog post is available on GitHub here.