TestDataDefinitionFramework (or TDDF for short) is a library that abstracts the setup of test data from the implementation of data backing stores, so that the same test code can be run against either an in-memory/fake repository or a "real" repository, using only pre-compiler conditionals to switch between the two.
An immediate design limitation is that this approach will ONLY work when you are using "dumb" data stores - so if you are running multi-table SQL queries or stored procedures, then this is not the library for you!
However, if your interactions with your data layer are usually of the form "store this entity in this table" or "query this entity from this table", then read on!
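As a rough rule of thumb, TDDF suits provider layers that look something like the following hypothetical interface - one entity type per table/collection, with simple store/query operations (the names here are illustrative, not part of TDDF):

// Hypothetical "dumb" provider interface of the kind TDDF works well with
public interface IWidgetRepository
{
    Task StoreAsync(Widget widget);              // "store this entity in this table"
    Task<IReadOnlyList<Widget>> GetAllAsync();   // "query this entity from this table"
}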
Ordinarily, when writing integration tests (for example using SpecFlow), you have to decide: is this going to run against an in-memory fake, or against the "real" repository?
There are pros and cons to each: "in-memory" is fast and can run on the build server, but doesn't thoroughly exercise your data provider layer. Using "real" repositories tests your code layers more thoroughly, as it proves the integration of your code with the chosen data storage engine, but is slower and requires connectivity to a running instance of that engine (e.g. a MongoDB or SQL instance).
Ideally it's nice to have both options, but this often leads to duplicated integration test code - in SpecFlow terms the "step definitions" can look very different when you want to set up a MongoDB database than when you only want to set up an in-memory context and pass it to an interceptor/fake repository.
The idea of TDDF is that by setting your test data against the TestDataStore you can use that same data in an in-memory repository just as easily as with a "real" backing store enabled; the TDDF plugins take care of actually standing up the "real" data resource.
- Clone the source code repository and build, or install packages via NuGet.
- Add a reference to "TestDataDefinitionFramework.Core" into your "integration tests" project (for example your SpecFlow/NUnit project).
- Choose which backing store plugins you want to use (e.g. TestDataDefinitionFramework.MongoDB)
- Wire up the code:
- Configure and initialize (do this before your tests run), e.g. in SpecFlow:
[BeforeTestRun]
public static async Task Initialize()
{
    var mongoBackingStore = new MongoBackingStore("ExampleSutDB");

    TestDataStore.AddRepository<SummaryItem>(cfg =>
    {
        cfg.WithName(SummaryCollection.Name);
#if UseRealProvider
        cfg.WithBackingStore(mongoBackingStore);
#endif
    });

    await TestDataStore.InitializeAllAsync();
}
- Use the TestDataStore to set up your test data, e.g. in SpecFlow:
[Given(@"the summaries repository returns '(.*)'")]
public void GivenTheSummariesRepositoryReturns(IReadOnlyList<string> items)
{
TestDataStore.Repository<SummaryItem>(SummaryCollection.Name).Items = items.Select(i => new SummaryItem {Name = i}).ToArray();
}
- Add your in-memory repository fakes (for when running in-memory mode):
public class WebTestFixture : WebApplicationFactory<Startup>
{
    protected override void ConfigureWebHost(IWebHostBuilder builder)
    {
        base.ConfigureWebHost(builder);
        builder.UseEnvironment("Testing");

        builder.ConfigureTestServices(services =>
        {
#if !UseRealProvider
            services.AddTransient<ISummariesRepository, InMemorySummariesRepository>();
#endif
        });
    }
}
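Your step definitions then call the SUT through this fixture in the usual WebApplicationFactory way; for example (the field name and endpoint are purely illustrative):

var client = _webTestFixture.CreateClient(); // in-process HttpClient provided by WebApplicationFactory
var response = await client.GetAsync("/api/summaries"); // illustrative endpoint on the SUT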
- Implement your in-memory repository:
public class InMemorySummariesRepository : ISummariesRepository
{
    public Task<IReadOnlyList<string>> GetAllAsync()
    {
        var result = TestDataStore.Repository<SummaryItem>(SummaryCollection.Name)
                         .Items?.Select(i => i.Name).ToArray()
                     ?? Array.Empty<string>();

        return Task.FromResult((IReadOnlyList<string>)result);
    }
}
- Commit the test data before calling the SUT (the best way to achieve this in SpecFlow is to trigger it before the "When" block):
[BeforeScenarioBlock]
public async Task Commit()
{
    // _scenarioContext is the SpecFlow ScenarioContext injected into this binding's constructor
    if (_scenarioContext.CurrentScenarioBlock == ScenarioBlock.When)
    {
        await TestDataStore.CommitAllAsync();
    }
}
Now you can run your tests against an in-memory fake, or against a "real" repository, simply by defining or removing a conditional compilation symbol called "UseRealProvider" (e.g. via DefineConstants in your test project, or a dedicated build configuration).
- See the ExampleTests project for a working version of the above
- If you haven't specified your own connection string, TDDF will spin up containers using Docker - so please ensure Docker is installed, and be patient on the first run while images are downloaded.
- For the MongoDB plugin, provide your own connection string if you already have a MongoDB instance running (beware that collections will be dropped/re-created)
- If you don't provide a connection string, then the code will attempt to spin up a MongoDB instance on port 27017 using Docker Desktop (so this must be installed and that port must be free if you rely on this feature)
- The collections will be dropped and re-created on each commit, so please don't point this at a working MongoDB database!
- Make sure your "repository" names in TDDF match up with the collection name you use in the "real" repository
- For the SQL plugin, provide your own connection string if you already have a SQL instance running (beware that tables will be dropped/re-created)
- If you don't provide a connection string, then the code will attempt to spin up a SQL instance on port 1433 using Docker Desktop (so this must be installed and that port must be free if you rely on this feature)
- The database will be created when using the Docker version
- Tables will be dropped and re-created on each commit, so please don't point this at a working SQL database!
- Make sure your "repository" names in TDDF match up with the table name you use in the "real" repository
- All objects are created in the dbo schema
- When using the built-in Docker-hosted SQL Server, point the SUT at the correct connection string by overriding your configuration object in the WebTestFixture, e.g.:
services.AddTransient<SqlDataStoreConfig>();
services.AddTransient<ISqlDataStoreConfig>(sp =>
{
    var config = sp.GetRequiredService<SqlDataStoreConfig>();

    config.ConnectionString = TestDataStore.Repository<SummaryDescription>().Config.BackingStore?
        .ConnectionString ?? config.ConnectionString;

    return config;
});
- The plugin uses "StringSet", so only supports SUTs that use StringGet to obtain data.
- Since Redis is a Key/Value store, you are required to provide the methods for serializing items into "string key" and "string value"
- The serialization method should match exactly how the SUT works, so it can deserialize the tests data successfully
- If your type doesn't naturally have a "key", then you can wrap it with a tuple, e.g.
var redisBackingStore = new RedisBackingStore(
    new KeyValueResolver()
        .WithResolver<(string Key, YourTypeHere Value)>(
            item => (item.Key, sutRedisSerializer.Serialize(item.Value))
        ));

TestDataStore.AddRepository<(string Key, YourTypeHere Value)>(cfg =>
{
#if UseRealProvider
    cfg.WithBackingStore(redisBackingStore);
#endif
});
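Test data is then assigned with the same tuple type via the repository's Items property (YourTypeHere and its Name property are placeholders for your own type):

TestDataStore.Repository<(string Key, YourTypeHere Value)>().Items = new[]
{
    ("item:1", new YourTypeHere { Name = "First" }),
    ("item:2", new YourTypeHere { Name = "Second" })
};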
If you want to build up an object across multiple steps and then "build" it as part of the commit, hook in earlier than the TDDF commit to set the state in the TestDataStore, rather than having to remove the builder, e.g.:
[Binding]
public class Context
{
    private readonly ScenarioContext _scenarioContext;

    public Context(ScenarioContext scenarioContext)
    {
        _scenarioContext = scenarioContext;
    }

    public MyClassDataBuilder MyClassDataBuilder { get; set; }

    [BeforeScenarioBlock(Order = 0)]
    public void BeforeCommit()
    {
        if (_scenarioContext.CurrentScenarioBlock == ScenarioBlock.When)
        {
            var myClassInstance = MyClassDataBuilder?.Build();

            TestDataStore.Repository<MyClass>().Items =
                myClassInstance != null
                    ? new[] { myClassInstance }
                    : Array.Empty<MyClass>();
        }
    }
}
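The builder itself can then be populated across your earlier "Given" steps by injecting the Context class above into your step definition classes (the step text and WithName method are placeholders):

[Binding]
public class MyClassSteps
{
    private readonly Context _context;

    public MyClassSteps(Context context) // SpecFlow injects the same per-scenario Context instance used above
    {
        _context = context;
    }

    [Given(@"a my-class item named '(.*)'")]
    public void GivenAMyClassItemNamed(string name)
    {
        _context.MyClassDataBuilder ??= new MyClassDataBuilder();
        _context.MyClassDataBuilder.WithName(name); // hypothetical builder method
    }
}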
Sometimes you'd like to capture the calls that were made to your provider layer so that you can make assertions about what was called and with what data. Obviously, by swapping the interceptor out for the "real" provider you lose this functionality (unless you could make the same assertion against the "real" repository, but that seems like a bigger problem).
The solution with TDDF is not to remove your interceptor when switching to "real" mode, but instead to use the interceptor class as a decorator over the "real" implementation, injecting the "real" class only when running in that mode. For example:
protected override void ConfigureWebHost(IWebHostBuilder builder)
{
    base.ConfigureWebHost(builder);
    builder.UseEnvironment("Testing");

    builder.ConfigureTestServices(services =>
    {
        services.AddTransient<IMyDataStore, MyDataStoreInterceptor>(); // <-- always use the interceptor
#if UseRealProvider
        services.AddTransient<RealMyDataStore>(); // <-- in "real" mode, also register the real implementation with the .NET DI container
#endif
    });
}
public class MyDataStoreInterceptor : IMyDataStore
{
    private readonly InterceptorsDataContext _interceptorsDataContext;
    private readonly RealMyDataStore _realDataStore;

    public MyDataStoreInterceptor(InterceptorsDataContext interceptorsDataContext, RealMyDataStore realDataStore = null) // <-- null when running in-memory
    {
        _interceptorsDataContext = interceptorsDataContext;
        _realDataStore = realDataStore;
    }

    public Task StoreAsync(MyClass data)
    {
        // Record the call so the tests can assert on what was stored
        _interceptorsDataContext.MyDataStoreContext.StoredData = data;

        return _realDataStore != null
            ? _realDataStore.StoreAsync(data)
            : Task.CompletedTask;
    }

    public Task<MyClass> GetAsync(string reference)
    {
        return _realDataStore != null
            ? _realDataStore.GetAsync(reference)
            : Task.FromResult(TestDataStore.Repository<MyClass>().Items?.FirstOrDefault(i => i.Reference == reference));
    }
}
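In a "Then" step you can then assert against the captured call regardless of which mode the tests run in, e.g. (using NUnit, with the InterceptorsDataContext injected into the step class):

[Then(@"the data was stored")]
public void ThenTheDataWasStored()
{
    Assert.That(_interceptorsDataContext.MyDataStoreContext.StoredData, Is.Not.Null);
}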
As you can see, this repository is still in its infancy and so far I've only needed to create a few plugins. Feel free to create your own plugins and raise a merge request so it can grow in its usefulness!