I've noticed that when creating a large number (around 25,000) of new ORM objects in a loop (in this case Treebeard MP_Node instances), Django eats up more and more of the machine's RAM, gigabytes of it, as it iterates through an in-memory dictionary and creates the models, and I can't wrap my head around why it does so.
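Roughly this pattern, for context (`Category` and `source_dict` are placeholder names here; `add_root`/`add_child` are treebeard's actual MP_Node methods):

```python
from django.db import models
from treebeard.mp_tree import MP_Node

class Category(MP_Node):
    # Placeholder tree model for illustration
    name = models.CharField(max_length=255)

root = Category.add_root(name="root")
for name in source_dict:  # source_dict: the ~25,000-entry in-memory dict (placeholder name)
    # Each call issues its own INSERT plus materialized-path bookkeeping,
    # and memory usage climbs steadily as the loop runs
    root.add_child(name=name)
```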
I've run into this before when creating objects in a migration, so I don't believe it's specific to the Treebeard library. I was wondering whether any of you have run into the same issue when creating a large number of models without bulk_create.
Obviously you should use bulk_create for something like this, but that's a limitation of the library: it doesn't have a method to bulk-create the nodes of the tree at once. I can get around this by using some private methods, but I'd rather not unless I have to.
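For comparison, this is the bulk path I'd normally take on a flat model (`Item` and `names` are placeholders); it's exactly what treebeard can't offer directly, since bulk_create won't maintain the materialized-path fields:

```python
from django.db import models

class Item(models.Model):
    # Placeholder flat model; bulk_create skips save()/signals and
    # does not compute treebeard's path/depth bookkeeping
    name = models.CharField(max_length=255)

Item.objects.bulk_create(
    [Item(name=name) for name in names],  # names: placeholder iterable
    batch_size=1000,  # insert in bounded chunks rather than one giant statement
)
```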