I'm writing a program that loads a graph with 200k vertices and 500k edges. I then need to find the vertices that lie within some distance of a given vertex (each vertex has a specific location). I'm doing this with recursive DFS, but for a sufficiently large graph the recursion gets deep enough to crash the program with a stack overflow. From what I've read, the common consensus is that this is a programming issue rather than a data-size issue. I can increase the stack size and it works, but I'd like to know whether I can change my DFS implementation to make it lighter on the stack.

I'm also wondering what determines the "weight" of each element on the stack during recursion: is each call creating new variables? What exactly is going on for it to crash?
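To illustrate the kind of change I'm asking about: I understand the usual fix is to replace the recursive calls with an explicit stack data structure, so the traversal depth is limited by heap memory instead of the call stack. Here's a minimal sketch of that idea (the adjacency-list representation and the names here are my own simplified example, not my actual code):

```python
def iterative_dfs(graph, start):
    """Depth-first traversal using an explicit stack instead of recursion,
    so a very deep graph cannot overflow the call stack."""
    visited = set()
    stack = [start]          # explicit stack replaces recursive calls
    while stack:
        v = stack.pop()
        if v in visited:
            continue
        visited.add(v)
        # push neighbours; already-visited ones are skipped when popped
        stack.extend(graph.get(v, ()))
    return visited

# tiny usage example on a hypothetical adjacency-list graph
g = {0: [1, 2], 1: [3], 2: [3], 3: []}
print(sorted(iterative_dfs(g, 0)))  # -> [0, 1, 2, 3]
```

Is this explicit-stack approach the right direction, or is there a better way to bound the stack usage?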
If someone could help me with this I would be really thankful!!