
I would like to do some memory profiling on a Python module that is called in an embedded fashion (from C++). There is some suspicion that parts of it are considerably less than optimal, but it is a fairly complicated module, making manual inspection quite cumbersome. I can change the Python code if needed (e.g. to add @profile function decorators, etc.). Ideally I would like to profile certain functions within these modules.

I have been looking at a few options, e.g. memory_profiler, but I cannot figure out how to get these to work in an embedded setting (they work fine when testing on stand-alone scripts). The question is whether anyone knows of a way to approach this problem.

I understand this question is perhaps somewhat vague, but a wider search online did not turn up any concrete strategies for approaching this. Even so, I expect this to be an issue that is relevant to other people as well.

1 Answer


I was too hasty when asking this question. The concern I raised (i.e. that memory_profiler could not handle this) was premature. It turns out our code redirects the output somewhere else, and the profiling report was all there.

In short, the following works perfectly well (as also explained in the memory_profiler docs):

from memory_profiler import profile

@profile
def your_function():
    ...  # body of the function you want to profile, line by line
and a nice line-by-line memory report will be shown. I apologise for the premature question, and can wholeheartedly recommend memory_profiler for this purpose.
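If the report seems to be missing because the embedding application captures or redirects stdout (as happened in our case), the profile decorator also accepts a stream argument, so the report can be written to a file instead. A minimal sketch, with the log file name being just an example:

from memory_profiler import profile

# Send the line-by-line report to a file instead of stdout;
# the file name here is only an example.
fp = open('memory_profiler.log', 'w+')

@profile(stream=fp)
def your_function():
    ...  # body of the function you want to profile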
