I have a project that:
- fetches data from Active Directory
- fetches data from different services based on the Active Directory data
- aggregates the data
- adds about 50,000 rows to the database every 15 minutes
I'm using PostgreSQL as the database and Django as the ORM. But I'm not sure Django is the right tool for this kind of project: I have to drop and re-insert 50,000 rows on every run, and I'm worried about performance. Is there a better way to do this?
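For context, the drop-and-reload step I'm describing looks roughly like the sketch below: delete everything, then re-insert in batches with `bulk_create` inside one transaction (the `DirectoryEntry` model name and the batching helper are just placeholders for illustration).

```python
from itertools import islice

def batched(iterable, size):
    """Yield successive lists of at most `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

# Hypothetical Django usage (assumes a model named `DirectoryEntry`
# and `rows` as an iterable of unsaved model instances):
#
# from django.db import transaction
#
# with transaction.atomic():
#     DirectoryEntry.objects.all().delete()
#     for chunk in batched(rows, 1000):
#         # One INSERT per chunk instead of one per row.
#         DirectoryEntry.objects.bulk_create(chunk)
```

Even with batching I'm unsure whether going through the ORM is the right approach at this scale, or whether I should drop down to raw SQL (e.g. PostgreSQL's `COPY`) for the load step.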