
I have legacy ASP.NET code that accesses a database. There is a data access layer that builds SqlCommands and executes them against the database.

What is the best way to unit test the data access layer? Should we actually connect to the database and execute test cases, or just use fakes?

Is it a good idea to use shims (described in the post below)?

http://msdn.microsoft.com/en-us/library/hh549176.aspx

2 Answers

Assuming your legacy DLL is managed, you should be able to use the Fakes feature in VS2012. Fakes is meant for exactly this kind of scenario. A typical use of Fakes works like this:

  1. Create a new unit test project.
  2. Add a reference to the legacy DLL (e.g. Legacy.DLL). Make sure all of its dependent DLLs are referenced in the unit test project as well.
  3. Right-click Legacy.DLL in the solution's References folder and choose "Add Fakes Assembly". This generates shims for the types defined in Legacy.DLL.
  4. Also add a reference to your product code (assuming you want to unit test a product method).
  5. In TestMethod1, you can start shimming the methods defined in Legacy.DLL and test your product code (see the sketch below).
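
For example, a shim-based test might look like the following minimal sketch. It assumes Legacy.DLL defines a class DataAccess with an instance method string GetCustomerName(int id), and that your product code exposes a Product.BuildGreeting(int) method that calls it; all of these names are hypothetical placeholders for your own types:

    using Microsoft.QualityTools.Testing.Fakes;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class LegacyDalTests
    {
        [TestMethod]
        public void TestMethod1()
        {
            // ShimsContext scopes the detours to this test only.
            using (ShimsContext.Create())
            {
                // Detour every call to DataAccess.GetCustomerName(int) so
                // the test never touches the real database. Fakes names the
                // shim property after the method plus its parameter types.
                Legacy.Fakes.ShimDataAccess.AllInstances.GetCustomerNameInt32 =
                    (dal, id) => "Fake Customer";

                // Exercise the product code that calls into Legacy.DLL.
                var greeting = Product.BuildGreeting(42);

                Assert.AreEqual("Hello, Fake Customer!", greeting);
            }
        }
    }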

You can also find useful info at http://msdn.microsoft.com/en-us/library/hh708916.aspx


The best way to test a data access layer is to write integration tests that actually connect to the database. It is NOT a good idea to use fakes (whether it's Microsoft Fakes or any other test isolation framework). Doing so would prevent you from verifying the query logic in your data access layer, which is why you'd want to test it in the first place.

With granular integration tests hitting a local SQL database via the shared memory protocol, you can easily execute hundreds of tests per minute. However, each test must be responsible for creating its own test environment (i.e. the test records in the tables it accesses) and cleaning it up afterwards to keep test execution reliable.

If your data access layer does not manage transactions explicitly, start by using TransactionScope to automatically roll back all changes at the end of each test; this is the simplest and best option (see the sketch below). If that does not work (e.g. your legacy code manages transactions internally), try deleting the data left by previous tests at the beginning of each test. Alternatively, you can ensure that tests don't affect each other by always using new, unique primary keys for all records in every test; this way you can clean up the test database once per batch instead of once per test and improve performance.
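
For example, the TransactionScope approach can be wired up once per test class, as in this minimal MSTest sketch. CustomerDataAccess is a hypothetical DAL class whose methods open their own SqlConnection against a local test database:

    using System.Transactions;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class CustomerDataAccessTests
    {
        private TransactionScope _scope;

        [TestInitialize]
        public void TestInitialize()
        {
            // Any connection opened during the test automatically enlists
            // in this ambient transaction.
            _scope = new TransactionScope();
        }

        [TestCleanup]
        public void TestCleanup()
        {
            // Disposing without calling Complete() rolls back everything
            // the test inserted, leaving the database clean for the next test.
            _scope.Dispose();
        }

        [TestMethod]
        public void InsertCustomer_CanBeReadBack()
        {
            var dal = new CustomerDataAccess(); // hypothetical DAL class

            dal.InsertCustomer(42, "Test Customer");

            Assert.AreEqual("Test Customer", dal.GetCustomerName(42));
        }
    }

Because the rollback happens in TestCleanup, the tests stay independent even when one of them fails partway through.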

2 Comments

I always find that in a data-driven ASP.NET MVC application (meaning almost all the business logic lives in the data operations - LINQ or SQL statements), unit testing those MVC controllers is time-consuming and yet fruitless - it can do nothing but ensure that the data access methods are called, while introducing a lot of code complexity.
@OlegSych - I'm really surprised by the comment: "It is NOT a good idea to use fakes. Doing so would prevent you from verifying the query logic in your data access layer, which is why you'd want to test it in the first place." I would disagree and advocate for both integration tests, as you mention, and true unit tests using some sort of fake. One does not always want to run full integration tests, as they are often slower and may not always pass (e.g. the database is remote and you're working offline, so you can't access it). Unit tests use stubs in place of the dependency, which solves this issue.
