
problems in proxmox kvm with min/max memory balloon #2100

@everanium

Description


Hello, I'm trying to use memory ballooning on Proxmox with Nanos unikernels, and I quickly ran into several problems. I know memory balloon support is a new feature in Nanos, but these problems may still be of interest:

After start (without traffic load) my usd app consumes ~900MB of memory by default. This is expected: the buffer = 2097152 option in my usd.conf is used to create 4 statically sized, preallocated arrays in the usd app's source code:

s1MemStat.Buffer = make([]MData, buffer)
s2MemStat.Buffer = make([]MData, buffer)
r1RedStat.Buffer = make([]RData, buffer)
r2RedStat.Buffer = make([]RData, buffer)

Shortly after start, syslog fills with errors like these:

Apr 20 10:54:15 10.100.0.25 usd [103.461604] id heap 0xffff800000000000: gap [0x0 0x97) found while deallocating [0x0 0x200)
Apr 20 10:54:15 10.100.0.25 usd [103.462231] id_dealloc failed, ra 0xffffffff8e7a7eb6
Apr 20 10:54:15 10.100.0.25 usd [103.462633] id heap 0xffff800000000000: gap [0xffe00 0x100000) found while deallocating [0xffe00 0x100000)
Apr 20 10:54:15 10.100.0.25 usd [103.463243] id_dealloc failed, ra 0xffffffff8e7a7eb6
Apr 20 10:54:15 10.100.0.25 usd [103.463644] id heap 0xffff800000000000: gap [0xffc00 0xffe00) found while deallocating [0xffc00 0xffe00)
Apr 20 10:54:15 10.100.0.25 usd [103.464239] id_dealloc failed, ra 0xffffffff8e7a7eb6
Apr 20 10:54:15 10.100.0.25 usd [103.464635] id heap 0xffff800000000000: gap [0xffa00 0xffc00) found while deallocating [0xffa00 0xffc00)
Apr 20 10:54:15 10.100.0.25 usd [103.465233] id_dealloc failed, ra 0xffffffff8e7a7eb6
Apr 20 10:54:15 10.100.0.25 usd [103.465637] id heap 0xffff800000000000: gap [0xff800 0xffa00) found while deallocating [0xff800 0xffa00)
Apr 20 10:54:15 10.100.0.25 usd [103.466247] id_dealloc failed, ra 0xffffffff8e7a7eb6
Apr 20 10:54:15 10.100.0.25 usd [103.466650] id heap 0xffff800000000000: gap [0xff600 0xff800) found while deallocating [0xff600 0xff800)
Apr 20 10:57:45 10.100.0.25 usd [6.758127] id_dealloc failed, ra 0xffffffff85a98eb6
Apr 20 10:57:45 10.100.0.25 usd [6.758514] id heap 0xffff800000000000: gap [0x7ffd3 0x80000) found while deallocating [0x7fe00 0x80000)
Apr 20 10:57:45 10.100.0.25 usd [6.759109] id_dealloc failed, ra 0xffffffff85a98eb6
Apr 20 10:57:45 10.100.0.25 usd [6.759583] id heap 0xffff800000000000: gap [0x7fa00 0x7fbd3) found while deallocating [0x7fa00 0x7fc00)
Apr 20 10:57:45 10.100.0.25 usd [6.760171] id_dealloc failed, ra 0xffffffff85a98eb6
Apr 20 10:57:45 10.100.0.25 usd [6.760556] id heap 0xffff800000000000: gap [0x7f925 0x7fa00) found while deallocating [0x7f800 0x7fa00)
Apr 20 10:57:45 10.100.0.25 usd [6.761143] id_dealloc failed, ra 0xffffffff85a98eb6
Apr 20 10:58:25 10.100.0.25 usd frame trace:
Apr 20 10:58:25 10.100.0.25 usd ffffc00002827f50:   ffffffff85a7f52b   (runloop_internal + 000000000000018b/00000000000009ea)
Apr 20 10:58:25 10.100.0.25 usd ffffc00002827fc0:   ffffffff85a686c3   (context_switch_finish + 0000000000000073/0000000000000211)
Apr 20 10:58:25 10.100.0.25 usd kernel load offset ffffffff85821000
Apr 20 10:58:25 10.100.0.25 usd loaded klibs: ntp@0xffffffffa1c3a000/0x9000 tls@0xffffffffc2c52000/0x92000 syslog@0xffffffff9795e000/0x6000
Apr 20 10:58:25 10.100.0.25 usd assertion p->prev && p->next failed at /home/eyberg/go/src/github.com/nanovms/nanos/src/runtime/list.h:45 (IP 0xffffffff85b1da2d) in list_delete()

To reproduce this quickly, you can use the test and bench environment I prepared in issue #2099.

Settings in proxmox:

I tried different min/max memory settings:

256MB/8GB
1GB/8GB
4GB/8GB
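For reference, the min/max pair in the Proxmox GUI maps to the `balloon` and `memory` options in the VM's qemu-server config (values in MiB). A sketch of the 1GB/8GB case, assuming a hypothetical VM id of 100:

```
# /etc/pve/qemu-server/100.conf (relevant lines only)
memory: 8192    # max memory visible to the guest
balloon: 1024   # minimum / balloon target
```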

[screenshot]

Another thing that looks a little strange:

With the 4GB/8GB settings, immediately after starting the VM:

[screenshot]

With the 1GB/8GB settings, immediately after starting the VM:

[screenshot]

With a fixed 4GB (without min/max memory), immediately after starting the VM:

In this mode there are no crashes like the ones above.

[screenshot]

[screenshot]

But in all cases, when I check how much memory is consumed on the Proxmox host, ps aux | grep kvm | grep usd shows normal memory consumption: 0.3% (~900MB) of the host machine's 256GB.
