
chunk option for create_batch() #1115

@Zerotask

Description

The problem

If you want to create a lot of objects, e.g. 2000 of them, the method can take very long (several minutes) and may eventually run out of memory or hit other issues.

Proposed solution

I'd like chunking to be either enabled by default or available through an optional chunk_size parameter.
For example, instead of processing 2000 objects at once, I'd like multiple iterations (e.g. 20 iterations of 100 objects each). That way I get faster feedback, and even if the run aborts due to memory issues etc., some objects have already been created.
To avoid a breaking change or unwanted behavior, the implementation could be a new parameter chunk_size: int = 0. If it is 0 (or negative), no chunking is used, so the default behavior doesn't change.
Chunking will only be applied if chunk_size > 0 and count > chunk_size.
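
A minimal sketch of the proposed behavior, written here as a standalone helper rather than an actual change to factory_boy (the name create_batch_chunked and the chunk_size parameter are part of this proposal, not existing API):

    def create_batch_chunked(factory_class, size, chunk_size=0, **kwargs):
        # chunk_size <= 0 (the default) keeps the current behavior:
        # all objects are created in a single pass.
        if chunk_size <= 0 or size <= chunk_size:
            return [factory_class.create(**kwargs) for _ in range(size)]
        objects = []
        for start in range(0, size, chunk_size):
            current = min(chunk_size, size - start)
            # Each iteration creates at most chunk_size objects, so if the run
            # aborts partway through, the earlier chunks already exist.
            objects.extend(factory_class.create(**kwargs) for _ in range(current))
        return objects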

An alternative consideration would be to use a generator instead of the current list comprehension:

    # current implementation
    return [cls.create(**kwargs) for _ in range(size)]

    # generator-based alternative
    for _ in range(size):
        yield cls.create(**kwargs)
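
A generator would keep memory usage flat and hand each object to the caller as soon as it is created, though by itself it would be a breaking change, since create_batch currently returns a list that callers may index or pass to len(). As a rough illustration of how such a lazy variant would be consumed (UserFactory and handle are placeholder names, not factory_boy API):

    # hypothetical usage if create_batch yielded objects one at a time
    for obj in UserFactory.create_batch(2000):
        handle(obj)  # placeholder for per-object work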
