Poor Memory Optimization when saving large Projects #25181

@rectrap

Description

I was analyzing a large binary (~70 MB) and wanted to save the session by running "Ps", because the "aa", "aaa", and "aaaa" analysis passes I had executed took quite some time.

r2 exited with "Killed" after running out of memory, as evidenced by dmesg:

[2881260.202621] Out of memory: Killed process 454123 (r2) total-vm:7042300kB, anon-rss:6668924kB, file-rss:264kB, shmem-rss:0kB, UID:1000 pgtables:13336kB oom_score_adj:0
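To put the numbers above in perspective, the kernel's OOM line can be decoded programmatically. A minimal sketch (assuming the standard `dmesg` OOM-killer format shown above; the variable names are mine) that extracts the memory figures and converts them to GiB:

```python
import re

# The OOM-killer line reported above, as printed by dmesg
oom_line = (
    "Out of memory: Killed process 454123 (r2) total-vm:7042300kB, "
    "anon-rss:6668924kB, file-rss:264kB, shmem-rss:0kB, UID:1000 "
    "pgtables:13336kB oom_score_adj:0"
)

# Each memory figure appears as "<key>:<value>kB"
fields = {k: int(v) for k, v in re.findall(r"([\w-]+):(\d+)kB", oom_line)}

# total-vm is the virtual address-space size; anon-rss is resident anonymous memory
print(f"total-vm: {fields['total-vm'] / 1024 / 1024:.2f} GiB")  # 6.72 GiB
print(f"anon-rss: {fields['anon-rss'] / 1024 / 1024:.2f} GiB")  # 6.36 GiB
```

So saving a project for a ~70 MB binary pushed r2 to roughly 6.7 GiB of virtual memory, almost all of it resident anonymous memory, i.e. heap allocations rather than mapped file data.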

I'd like to suggest better memory management during large project-save operations. I do not know radare2's internals, but it feels like this could be improved.
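As a stopgap until the save path itself is improved, a hedged workaround (this is not a radare2 feature, just a POSIX process-limit sketch; the 6 GiB cap and the binary path are illustrative assumptions) is to run r2 with a capped address space, so a runaway save fails with ENOMEM inside the process instead of the kernel OOM killer terminating the whole session:

```python
import resource
import subprocess

# Hedged workaround sketch, not radare2's own mechanism: cap the child's
# address space so allocations fail with ENOMEM (which r2 can report) rather
# than the OOM killer ending the session. The 6 GiB cap is an illustrative
# assumption; pick a value that fits your machine.
CAP_BYTES = 6 * 1024 ** 3

def run_capped(cmd):
    """Run cmd with RLIMIT_AS applied in the child before exec (POSIX only)."""
    return subprocess.run(
        cmd,
        preexec_fn=lambda: resource.setrlimit(
            resource.RLIMIT_AS, (CAP_BYTES, CAP_BYTES)
        ),
    )

# Illustrative usage (binary path is a placeholder):
# run_capped(["r2", "-A", "/path/to/large_binary"])
```

The same effect can be had from a shell with `ulimit -v` before launching r2; the point is only to turn an abrupt SIGKILL into a recoverable allocation failure.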

Metadata

    Labels

    projects: Loading, saving and handling radare2 project files
