The best way to test a data access layer is to write integration tests that actually connect to the database. It is NOT a good idea to use fakes, whether via Microsoft Fakes or any other test isolation framework: doing so would prevent you from verifying the query logic in your data access layer, which is the very reason you want to test it in the first place.
With granular integration tests hitting a local SQL Server database over the shared memory protocol, you can easily execute hundreds of tests per minute. However, each test must be responsible for creating its own test environment (i.e. the records in the tables it accesses) and cleaning it up afterwards, so that tests run reliably in any order.

If your data access layer does not manage transactions explicitly, start with TransactionScope to automatically roll back all changes at the end of each test; this is the simplest and most robust option. If that does not work (for example, because your legacy code manages transactions internally), delete the data left by previous tests at the beginning of each test instead. Alternatively, you can keep tests from affecting each other by always generating new, unique primary keys for every record in every test; then you only need to clean up the test database once per batch instead of once per test, which improves performance.
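The TransactionScope rollback pattern can be sketched as follows. This is a minimal illustration, assuming an MSTest-style test class; the class and test names are hypothetical placeholders, not part of any real project.

```csharp
using System;
using System.Transactions;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class CustomerRepositoryTests
{
    private TransactionScope _scope;

    [TestInitialize]
    public void Setup()
    {
        // Any connection opened inside the test automatically enlists
        // in this ambient transaction.
        _scope = new TransactionScope();

        // ... insert the test records this particular test needs ...
    }

    [TestCleanup]
    public void Teardown()
    {
        // Disposing the scope without calling Complete() rolls back
        // all changes, leaving the database as the test found it.
        _scope.Dispose();
    }

    [TestMethod]
    public void GetById_ReturnsInsertedCustomer()
    {
        // Arrange, act, and assert against the real database here;
        // everything is rolled back in Teardown.
    }
}
```

Because the rollback happens in cleanup rather than in each test body, the pattern works even when a test fails or throws before reaching its last line.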