I have hourly data stored as parquet files in an Azure Storage container. The container has six folders (a, b, c, d, e, f), and a table with the same name as each folder has been created in Azure Data Explorer (ADX). Within each folder the data is partitioned by time, for example parquet files under a/year=2025/month=01/day=01/hour=00/. The problem: if I ingest every hour's data directly into ADX, I accumulate a large amount of data each hour, most of which is identical to the previous hour, so a lot of redundant information piles up. What I want is a strategy that ingests data into ADX only when the incoming parquet data has actually changed.
I considered a solution using a staging table, but then I would just be accumulating the same volume of data in the staging table instead, so that doesn't solve the problem. I also need the solution to be fully automated.
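To make the "ingest only on change" idea concrete, here is a minimal sketch of one possible pre-ingestion filter, written in plain Python. It keeps a JSON manifest of SHA-256 digests per parquet file between runs and returns only the files whose content changed, so only those would be sent to ADX. All names here (changed_files, the manifest layout) are hypothetical illustrations, not part of any existing tool; note also that byte-level hashing only works if the upstream job writes parquet deterministically, otherwise you would need to hash the row data itself.

```python
import hashlib
import json
import pathlib

def file_digest(path: pathlib.Path) -> str:
    # SHA-256 over the file's raw bytes; byte-identical dumps hash the same.
    return hashlib.sha256(path.read_bytes()).hexdigest()

def changed_files(folder: pathlib.Path, manifest_path: pathlib.Path):
    """Return parquet files under `folder` whose content changed since last run.

    The manifest is a JSON map {relative_path: digest} persisted between runs.
    Only new or changed files are returned (and the manifest is updated),
    so the caller ingests just those into ADX instead of everything.
    """
    manifest = {}
    if manifest_path.exists():
        manifest = json.loads(manifest_path.read_text())

    changed = []
    for p in sorted(folder.rglob("*.parquet")):
        digest = file_digest(p)
        key = str(p.relative_to(folder))
        if manifest.get(key) != digest:   # new file or content differs
            changed.append(p)
            manifest[key] = digest

    manifest_path.write_text(json.dumps(manifest, indent=2))
    return changed
```

A job like this could run once per hour (e.g. from an Azure Function or a scheduled pipeline) before triggering ingestion; on an hour where nothing changed, changed_files returns an empty list and nothing is ingested.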
Please suggest the best tool and strategy for this.