[Greenhouse] Memory profiling tips
Asheesh Laroia
asheesh at asheesh.org
Wed Jul 17 00:14:46 UTC 2013
Hi Dave, and anyone else interested,
Dave was noticing that the UDD import code gets killed on Dotcloud,
presumably by running out of RAM.
As in all programming, to fix this bug, one must know where it is.
Therefore, I suggest being able to reproduce it locally. To do that, here
is a quick Asheesh tutorial on memory profiling with Python.
Do all this on a machine with oodles of RAM; we'll use this as a way to
get the stats on a working version of the program.
First, install 'memory_profiler' on your local machine or into your virtualenv, e.g.
$ pip install memory_profiler
Then, visit https://pypi.python.org/pypi/memory_profiler and create a file
called example.py with the contents suggested in the "Usage" section.
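(For reference, the "Usage" snippet on that page is, as best I recall, roughly
the sketch below; treat the page itself as authoritative. Note that the
@profile name gets injected for you when the script runs under
'-m memory_profiler', so no import is needed.)

    # example.py -- roughly the snippet from memory_profiler's "Usage" section
    @profile
    def my_func():
        a = [1] * (10 ** 6)       # about 8 MB of list on CPython
        b = [2] * (2 * 10 ** 7)   # about 160 MB; this is the big spike
        del b                     # memory drops back down here
        return a

    if __name__ == '__main__':
        my_func()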
Then, run "python -m memory_profiler example.py" and make sure you can see
the same table they show.
So far, so good?
Now make sure you understand it! Read carefully or ask questions.
Finally, temporarily hack your Django code so that the management
command's handle() method has this @profile decorator applied to it, and
run it via 'python -m memory_profiler manage.py my_management_command'.
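Here is a rough sketch of what that temporary hack could look like (the module
path, command name, and body are placeholders; your real command lives wherever
it already lives). Importing 'profile' explicitly also works if you'd rather
run the command the normal way:

    # myapp/management/commands/my_management_command.py  (placeholder path)
    from django.core.management.base import BaseCommand
    from memory_profiler import profile  # running under -m memory_profiler injects this too

    class Command(BaseCommand):

        @profile  # temporary hack -- remove once we've found the hungry line
        def handle(self, *args, **options):
            # ... the existing UDD import logic stays here unchanged ...
            pass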
Then you should get a table like the one you saw for example.py.
Is there one line that sees a massive spike in memory usage that isn't
what you'd expect?
If so, then maybe we can make that one line perform better!
-- Asheesh.