I'm solving a problem using Step Functions workflows.
The problem goes like this: I have a workflow of 10 AWS Batch jobs.
The first 3 jobs run in sequence, and jobs 4-7 are dynamic steps, i.e., they need to run multiple times with different parameters as specified.
And for each 4-5-6-7 execution, there are multiple executions of jobs 8-9-10, based on the number of parameters.
Map looks like the best possible fit here, but if any job fails inside the Map state for 4-5-6-7, the entire state fails. I don't want one execution to affect the other executions.
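For context, this is the kind of Map iterator I mean. A sketch in Amazon States Language, with a `Catch` on the task so a failed iteration is absorbed instead of failing the whole Map; the job names, queue, and ARNs are placeholders, and jobs 5-7 would chain off job 4 via `Next` in the same iterator:

```json
{
  "StartAt": "RunJobs4to7",
  "States": {
    "RunJobs4to7": {
      "Type": "Map",
      "ItemsPath": "$.parameterSets",
      "Iterator": {
        "StartAt": "Job4",
        "States": {
          "Job4": {
            "Type": "Task",
            "Resource": "arn:aws:states:::batch:submitJob.sync",
            "Parameters": {
              "JobName": "job-4",
              "JobDefinition": "job-4-definition",
              "JobQueue": "my-queue"
            },
            "Catch": [
              {
                "ErrorEquals": ["States.ALL"],
                "ResultPath": "$.error",
                "Next": "IterationFailed"
              }
            ],
            "End": true
          },
          "IterationFailed": {
            "Type": "Pass",
            "Result": { "status": "FAILED" },
            "End": true
          }
        }
      },
      "End": true
    }
  }
}
```

With the `Catch`, a failing iteration ends in the `IterationFailed` pass state and reports a result, so the other iterations keep running; without it, one failure terminates the whole Map state, which is the behavior I'm trying to avoid.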
Approach: I have designed 3 step functions. The first step function runs jobs 1-3, and its last step calls a Lambda function that submits multiple executions of the 4-5-6-7 workflow. Then, for each 4-5-6-7 execution, another Lambda gets triggered to submit multiple executions of jobs 8-9-10.
So I'm connecting the step functions manually through Lambda functions.
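The submitting Lambda looks roughly like this. A minimal boto3 sketch, assuming the child state machine ARN and the shape of `event["parameter_sets"]` (a list of parameter dicts), which are my own placeholders:

```python
import json

# Hypothetical ARN of the second state machine (jobs 4-7); replace with yours.
STATE_MACHINE_ARN = "arn:aws:states:us-east-1:123456789012:stateMachine:jobs-4-to-7"


def build_execution_inputs(parameter_sets):
    """Build one JSON execution input per parameter set."""
    return [json.dumps({"parameters": p}) for p in parameter_sets]


def lambda_handler(event, context):
    # boto3 is available in the Lambda runtime; imported here so the module
    # stays importable outside AWS.
    import boto3

    sfn = boto3.client("stepfunctions")
    started = []
    for body in build_execution_inputs(event["parameter_sets"]):
        # Each StartExecution is an independent execution, so one child
        # failing does not affect the others.
        resp = sfn.start_execution(stateMachineArn=STATE_MACHINE_ARN, input=body)
        started.append(resp["executionArn"])
    return {"executions": started}
```

The Lambda that fans out jobs 8-9-10 per 4-5-6-7 execution would be the same pattern against the third state machine's ARN.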
Is this the correct approach, or are there better ways of doing it?