
Is it good practice to just import the models of my Django app into the secondary app and query the database directly? Does it have any performance implications? The second application is a simple, lightweight websocket server.
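What I have in mind is roughly the script below (the project, app, and model names are placeholders for my real ones):

```python
# standalone_query.py -- runs alongside the websocket server, outside the main web app
import os
import django

# Point this process at the existing project's settings before importing any models.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")
django.setup()

from chat.models import Message  # placeholder app/model; import must come after setup()


def latest_messages(limit=10):
    # This process now holds its own database connection via the ORM,
    # separate from the main application's web workers.
    return list(Message.objects.order_by("-created")[:limit])


if __name__ == "__main__":
    for msg in latest_messages():
        print(msg.pk, msg.text)
```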

  • Possible duplicate of "How to share a Django model?" Commented Mar 9, 2017 at 22:04
  • Django models can only be used by a Django app. Your websocket service is most probably not written in Django, so you won't be able to use these models. Commented Mar 9, 2017 at 22:46

2 Answers


I've answered, although this might be better served on the Programmers SE.

Tl;dr: No, it's not good practice, but it doesn't necessarily lead to performance issues. It does make everything more complicated though. Avoid it if you can. I have a system that does this, and I'm trying to get away from this setup.

Now firstly, as Apero said in comments:

Django models can only be used by a Django app. Your websocket service is most probably not written in Django so you won't be able to use these models

But I can offer some context and practicality on this:

Having multiple systems or apps doesn't necessarily lead to performance issues, any more than having multiple app instances does, subject to the caveats below.

Your main issue is how many connections each system opens and holds, whether any of them run particularly long or intensive queries, and how those overlap between the different systems. Do you have easy visibility of that?
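If you don't, MySQL's processlist is a quick way to see who is connected and what they're running. A rough sketch of pulling it through Django's default connection (any MySQL client would do just as well):

```python
from django.db import connection


def active_queries():
    # information_schema.PROCESSLIST is standard MySQL; skip idle connections
    # so you only see work actually in flight.
    with connection.cursor() as cursor:
        cursor.execute(
            """
            SELECT id, user, host, db, command, time, state, LEFT(info, 120)
            FROM information_schema.PROCESSLIST
            WHERE command <> 'Sleep'
            ORDER BY time DESC
            """
        )
        return cursor.fetchall()
```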

We have a mid-sized legacy CRM database (250 users, an average of 3 pages/second throughout the day, around 30 GB of data, in MySQL) that has:

  • 8 web workers
  • 30+ cron jobs, run as Django management commands, that access the database (so each creates its own connection to the DB), some as frequently as every minute - a minimal example of one of these is sketched after this list.
  • 4 or 5 'standalone' systems (reporting tools such as Superset and the like) that again create their own connections.
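For reference, each of those cron jobs is just an ordinary management command along these lines (the app, command, and model names here are made up); every run is a separate process with its own connection to the database:

```python
# myapp/management/commands/expire_sessions.py
from django.core.management.base import BaseCommand
from django.utils import timezone

from myapp.models import Session  # hypothetical app and model


class Command(BaseCommand):
    help = "Delete expired sessions; run from cron, some jobs as often as every minute."

    def handle(self, *args, **options):
        deleted, _ = Session.objects.filter(expires_at__lt=timezone.now()).delete()
        self.stdout.write(f"Deleted {deleted} expired sessions")
```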

We run all of that on a T2 Micro instance on Amazon Web Services. It never really gets particularly taxed, except when we're running the large scale batch jobs (inserting 250k records into our 110m row analytics table, which is poorly designed and not properly indexed, but that's another tale). That gets run overnight as it tends to slow everything down.

But you asked about best practice.

This answer is a good overview of some of the pitfalls.

In the setup I've described above, there are a number of areas where different systems' updates overlap on the same tables. This is bad, and makes it difficult to debug what happened: was it the finance system or the CRM that changed that record?

Broadly, the other (non-CRM) apps are reporting systems - effectively just SELECT statements. With that in mind, one of the first things I did when I arrived was put a read replica in place for the reporting systems and shift them over to it. Now even if someone runs a text keyword search across our biggest tables, it doesn't affect the main CRM system's performance for customer-record updates and other INSERT or UPDATE queries.
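For what it's worth, if any of those read-only consumers had been Django apps themselves, the same primary/replica split can be expressed with a second database alias and a database router. A rough sketch, with made-up hostnames and module path:

```python
# settings.py (of the reporting app)
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "crm",
        "HOST": "primary.db.internal",   # any writes still go to the primary
    },
    "replica": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "crm",
        "HOST": "replica.db.internal",   # heavy reporting reads go here
    },
}
DATABASE_ROUTERS = ["myproject.routers.ReplicaRouter"]


# myproject/routers.py
class ReplicaRouter:
    def db_for_read(self, model, **hints):
        return "replica"

    def db_for_write(self, model, **hints):
        return "default"

    def allow_relation(self, obj1, obj2, **hints):
        # Both aliases point at the same data, so relations are fine.
        return True
```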




You might run into database performance issues if the traffic from both apps overloads the database, but that's a solvable problem.
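For example, persistent connections and a short connect timeout in each app's database settings keep the total connection count predictable. A sketch with illustrative values, assuming Django clients on MySQL:

```python
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "crm",
        # Reuse each connection for up to 60 seconds instead of reconnecting
        # on every request, so the two apps don't churn connections.
        "CONN_MAX_AGE": 60,
        "OPTIONS": {
            # Fail fast rather than letting waiting connections pile up.
            "connect_timeout": 5,
        },
    }
}
```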

