
I have thousands of SQL Server databases (one per client). When we push to production, we usually have changes in the databases, the web API, and the web application.

The problem is the time it takes to deploy everything, especially the databases. We are using Entity Framework Code First migrations, ASP.NET MVC, and SQL Server, all on the latest versions. It is a SaaS product, and the Code First migration process updates the databases one by one.

The API and the web application deploy very quickly, within a few seconds. The databases, however, take about 30 minutes to update. During that time, some users get errors and cannot use the software because the API hits databases that haven't been updated yet. Worse, if something fails and stops partway through the database updates, the users on non-updated databases are stuck until we fix the issue and update the remaining databases.

Any idea how to solve this problem and make clients happy?

PS: The web application doesn't access the databases directly; only the API does.

  • If you have 1,000 databases, there really isn't much you can do: it is going to take time to update all those databases. The only thing you could do is take the application offline during the update process, which really should be happening anyway. Find a window of time where you can do system upgrades and make sure your clients understand that window. Do your updates during that time frame and nobody should have any issues. Commented Dec 20, 2016 at 15:16
  • I'd also consider updating each client's database one by one rather than updating them all at once. Commented Dec 20, 2016 at 15:23

1 Answer


This question is somewhat opinion-based. The maintenance-window approach is the easiest. If you want to do live updates, another way would be:

  1. Keep a version number in the database
  2. Allow running multiple versions of the Web API side-by-side
  3. Choose which version of the API to use by looking at the version in the database
  4. Determine whether the Web API's public interface is stable. If it is not, also find a way to run multiple versions of the web site side-by-side, again choosing one based on the version in the database
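As a sketch of steps 1 and 3 (shown in Python purely for illustration; the table name `SchemaVersions`, the version numbers, and the deployment URLs are all invented, and in a real system the version lookup would be a query against each tenant's database):

```python
# Hypothetical mapping from each tenant database's schema version to the
# API deployment that understands that schema. In production this data
# would come from a SchemaVersions table, one row per tenant database.
TENANT_SCHEMA_VERSIONS = {
    "client-a": 42,  # already migrated to the new schema
    "client-b": 41,  # still on the old schema
}

# Two API deployments running side-by-side during the rollout window.
API_DEPLOYMENTS = {
    41: "https://api-v41.internal.example",
    42: "https://api-v42.internal.example",
}

def api_base_url(tenant: str) -> str:
    """Pick the API deployment matching the tenant's current schema version."""
    version = TENANT_SCHEMA_VERSIONS[tenant]
    return API_DEPLOYMENTS[version]
```

The key point is that each request (or at least each login) is dispatched based on the tenant's *database* version, not on which code happens to be newest, so a half-finished rollout never sends a tenant to code that doesn't match its schema.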

The most maintainable way to accomplish this would probably be to have at least 3 servers:

  1. One backend server which hosts the old version
  2. One backend server which hosts the new version
  3. The frontend server which routes users to the proper backend server based on the current version.

The routing could take place only at login, or you could do something fancier, such as redirecting a logged-in user when an upgrade is detected. Obviously, none of this deals with what happens to a particular client during the actual upgrade of that client's database; you'll still need to address that separately.
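The frontend server's routing decision could be sketched like this (again illustrative Python; the backend names, the version cutoff, and the "upgrading" flag are assumptions, not part of the original answer):

```python
# Hypothetical login-time router for the three-server layout above:
# one backend on the old version, one on the new, and this frontend
# deciding per tenant where to send the user.
OLD_BACKEND = "https://old.backend.internal"
NEW_BACKEND = "https://new.backend.internal"
NEW_SCHEMA_VERSION = 42

def route_at_login(tenant_version: int, upgrading: bool) -> str:
    """Return the backend URL a tenant should be sent to at login.

    A tenant whose database is mid-migration gets neither backend;
    the frontend can show a brief maintenance page for just that
    tenant instead of blocking everyone.
    """
    if upgrading:
        raise RuntimeError("tenant database is being migrated; retry shortly")
    if tenant_version >= NEW_SCHEMA_VERSION:
        return NEW_BACKEND
    return OLD_BACKEND
```

Handling the "upgrading" state explicitly also contains the failure mode from the question: if a migration run dies halfway, only the tenants still flagged as upgrading are affected, while already-migrated and not-yet-migrated tenants keep working against their matching backend.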


2 Comments

That's pretty good. However, the frontend changes a lot as well, so it should be versioned too, and paired with a specific API version. We use the subdomain to select the client, and therefore the version to use, at the login stage. So I guess we will need some sort of "login" website just to redirect to the right web server. But how can I route from the "login" website transparently to the right frontend server, so that the domain stays the same (we are using a wildcard SSL certificate)?
Why have the domain stay the same? Why wouldn't each server have its own subdomain? For example, one SaaS product my company uses has a www.company.com login page and a ww3.company.com frontend server. I assume they have ww1 and ww2 servers somewhere as well.
