
I am struggling to decide on the best approach for using my database in my unit tests. I am using xUnit for the tests, and my database model is code-first with Entity Framework.

The options I have are InMemory and a real database, both set up through the Entity Framework context.

Both seem to have issues though:

InMemory - Good because it can spin up database instances and destroy them fast in memory. Bad because it doesn't find native SQL bugs and can't test functions that run raw SQL commands.

Real - Good because it can find native SQL bugs and test functions that run raw SQL commands. Bad because DbContext is not thread safe and xUnit runs tests in parallel, which means I would need to spin up a new real database for each test (time-consuming and messy).
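For reference, the two setups look roughly like this (a sketch; AppDbContext stands in for my real context and the connection string is just an example):

    using System;
    using Microsoft.EntityFrameworkCore;

    // Option 1: InMemory (Microsoft.EntityFrameworkCore.InMemory package)
    var inMemoryOptions = new DbContextOptionsBuilder<AppDbContext>()
        .UseInMemoryDatabase(Guid.NewGuid().ToString()) // unique store per test
        .Options;

    // Option 2: real database (Microsoft.EntityFrameworkCore.SqlServer package)
    var sqlOptions = new DbContextOptionsBuilder<AppDbContext>()
        .UseSqlServer(@"Server=(localdb)\MSSQLLocalDB;Database=Tests;Trusted_Connection=True;")
        .Options;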

Which is the preferred solution here? Both seem to have pros and cons. Is there a best solution here I am missing?

3 Answers


I recommend the pyramid approach. At the bottom of the pyramid are unit tests: they are numerous and your quickest way to get feedback on code you just wrote. On top of those are your integration tests, which are far fewer. Above those come full regression tests and then smoke tests, with each slice of the pyramid getting smaller.

Having said all this, why not both? Think of InMemory as unit tests, and Real as integration tests.

At the very least I recommend unit tests with an InMemory database. Yes, unit tests are isolated and do not catch all the bugs, but the benefit here is that you now have a solid testing foundation to build upon. Unit tests can also run as part of your build, and if someone happens to break something by accident, they act as a safety net and catch it.
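As a concrete sketch (assuming a typical context called AppDbContext with a Products set and a Product entity; adapt the names to your model), an InMemory-backed xUnit test looks like this:

    using System;
    using System.Threading.Tasks;
    using Microsoft.EntityFrameworkCore;
    using Xunit;

    public class ProductTests
    {
        // A fresh, uniquely named store per test keeps tests isolated.
        private static AppDbContext CreateContext() =>
            new AppDbContext(new DbContextOptionsBuilder<AppDbContext>()
                .UseInMemoryDatabase(Guid.NewGuid().ToString())
                .Options);

        [Fact]
        public async Task AddProduct_PersistsProduct()
        {
            using var context = CreateContext();

            context.Products.Add(new Product { Name = "Widget" });
            await context.SaveChangesAsync();

            Assert.Equal(1, await context.Products.CountAsync());
        }
    }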


1 Comment

Thanks for this response. It makes a lot of sense.

The current Entity Framework documentation suggests using a real database.

The main reasons are:

  • Launching a real database is now very easy, for example with Docker containers.

  • With a local database, communication overhead is negligible.

  • Mocked databases may not properly replicate the production database's behavior, leading to tests that would fail if run against the production database, and to buggy logic.
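A sketch of how this can look with xUnit (the connection string, AppDbContext, and test names are assumptions; substitute your own). Each test creates its own short-lived DbContext from the shared options, so DbContext's lack of thread safety is not an issue, and xUnit runs tests within a single class sequentially by default:

    using System;
    using Microsoft.EntityFrameworkCore;
    using Xunit;

    // One uniquely named database per test class: created before the
    // tests run, dropped afterwards.
    public class SqlServerFixture : IDisposable
    {
        public DbContextOptions<AppDbContext> Options { get; }

        public SqlServerFixture()
        {
            Options = new DbContextOptionsBuilder<AppDbContext>()
                .UseSqlServer($@"Server=localhost,1433;Database=tests_{Guid.NewGuid():N};User Id=sa;Password=<your password>;TrustServerCertificate=True;")
                .Options;

            using var context = new AppDbContext(Options);
            context.Database.EnsureCreated(); // builds the schema from the code-first model
        }

        public void Dispose()
        {
            using var context = new AppDbContext(Options);
            context.Database.EnsureDeleted();
        }
    }

    public class OrderTests : IClassFixture<SqlServerFixture>
    {
        private readonly SqlServerFixture _fixture;

        public OrderTests(SqlServerFixture fixture) => _fixture = fixture;

        // Each test news up its own AppDbContext from _fixture.Options,
        // so no single DbContext instance is shared across threads.
    }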

3 Comments

That does not eliminate the primary issues with unit tests: (1) Starting the database and setting up a schema takes time, even if the article states that it's extremely fast (it's not). (2) The database shares state across tests. Both can be mitigated but that's non-trivial. I think that the original accepted answer is generally a better call. In my experience, the answer is 'it depends'.
@ZdeněkJelínek The EF team themselves strongly discourage using the in-memory provider. As a developer, I would err on the side of caution and never, ever use it.
@GertArnold Thanks for pointing that out. While I agree that the in-memory provider is not a good approach, I have a bunch of colleagues who use it and it works for them in their context (i.e. very large DB schema, not too high quality expectations :)). That being said, I wouldn't use it either. But I wouldn't use LocalDB and such for unit tests either.

Both have their pros and cons.

InMemory is super fast and great for quick logic checks, but it doesn't behave like a real database: no real SQL, no constraints, so you might miss bugs.

Real database gives you the full picture and catches SQL-related issues, but it’s slower and tricky to manage in parallel tests.

A nice middle ground is using Testcontainers: it spins up a real database (like SQL Server or Postgres) in Docker just for your tests, then tears it down automatically. So you get real behavior without the headache of managing it manually.
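A minimal sketch using the Testcontainers.MsSql package (MsSqlBuilder, StartAsync, and GetConnectionString come from that library; AppDbContext is assumed, and SqlQueryRaw requires EF Core 8+):

    using System.Threading.Tasks;
    using Microsoft.EntityFrameworkCore;
    using Testcontainers.MsSql;
    using Xunit;

    public class SqlContainerTests : IAsyncLifetime
    {
        // One throwaway SQL Server container for this test class.
        private readonly MsSqlContainer _container = new MsSqlBuilder().Build();

        public Task InitializeAsync() => _container.StartAsync();

        public Task DisposeAsync() => _container.DisposeAsync().AsTask();

        [Fact]
        public async Task RawSqlRunsAgainstRealSqlServer()
        {
            var options = new DbContextOptionsBuilder<AppDbContext>()
                .UseSqlServer(_container.GetConnectionString())
                .Options;

            using var context = new AppDbContext(options);
            await context.Database.EnsureCreatedAsync(); // schema from the code-first model

            // Raw SQL works here, unlike with the InMemory provider.
            var result = await context.Database
                .SqlQueryRaw<int>("SELECT 1 AS Value")
                .SingleAsync();

            Assert.Equal(1, result);
        }
    }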
