[BUG] [report v2019.2.3, v3000] 25% memory usage on Raspberry Pi (Ubuntu 64bit) #56206
Comments
That is a surprising amount of memory. You might see if you can update the number of worker processes to just 1. I've used minions and masters on my Pi before - the only real problem that I saw was that occasionally it would get into a runaway loop that would start eating all my memory, but that's a different issue.
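If the "worker processes" suggestion above refers to the master side, that count is controlled by the `worker_threads` option in the master config (there is no equivalent minion option; the minion forks a small fixed set of processes). A minimal sketch, assuming the master config lives at the default path:

```yaml
# /etc/salt/master
# Reduce the master's MWorker process count (default is 5).
# Note: the Salt docs advise against very low values on busy masters.
worker_threads: 1
```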
Hello Wayne, thanks for your hints! I have tried to disable multithreading as follows, but no effect. I still got 3 minion processes. Maybe it's a bug in the current release 🤭
This runaway loop sounds bad 😕 Did you find a solution for this? ulimit?
Upgrading to version 3000 did not make a difference.
Try disabling some execution modules and grains using the disable_modules and disable_grains minion config options. You can find the list of loaded modules using the following command:

salt RASPBERRYPI sys.list_modules

Then add the modules you don't need to the minion config:

disable_modules:
- consul
- glassfish
- grafana4
- jboss7
- ...
- vsphere

As for grains, you can find the full list of grains modules here, and also disable the ones you don't need:

disable_grains:
- chronos
- cimc
- esxi
- fibre_channel
- fx2
- iscsi
- junos
- marathon
- mdadm
- mdata
- metadata
- napalm
- nvme
- nxos
- panos
- philips_hue
- rest_sample
- smartos
- ssh_sample
- zfs

This can also considerably speed up the minion response time: #48773 (comment)
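One way to build such a disable list is to diff the output of `sys.list_modules` against a keep-list of the modules you actually use. A sketch, where both the keep-list and the module list below are made-up sample data (on a real system the second `printf` would be replaced by the output of `salt RASPBERRYPI sys.list_modules`):

```shell
# Hypothetical keep-list of modules you actually use
printf '%s\n' cmd file pkg service test > /tmp/keep_modules.txt

# Stand-in for the real `sys.list_modules` output; every line not in the
# keep-list is a candidate for disable_modules in the minion config
printf '%s\n' cmd consul file glassfish pkg service test vsphere \
  | grep -vxFf /tmp/keep_modules.txt
# prints: consul, glassfish, vsphere (one per line)
```

`grep -vxFf` treats each keep-list line as a fixed string (`-F`), matches whole lines only (`-x`), and inverts the match (`-v`), so only modules outside the keep-list survive.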
Thanks for your further hints! I tried to disable as many modules as possible but it did not make a difference ... now the three salt-minion processes use 27% of the memory 🙈 My config is as follows now:
Sorry, a small off-topic question: is the progress bar in your screenshot broken (for top)? Why do we see 60.9/908.2 MB?
@4815162342lost
@sjkummer You disabled too many modules. You'll probably want to run things like … As for the 27%, I'm quite surprised; the memory usage should go down. I'll try it on my Pi in a few days.
Btw, I also switched from Python 2.7 to Python 3, but no change.
Ok, it looks like I actually have a Pi I can access, and below are my results (Raspbian Buster Lite, Salt 3000 Py3):

Default install. Measurements right after the …

With disabled modules/grains (/etc/salt/minion.d/disable.conf). Measurements right after the …
So, the RES memory usage is a bit lower:
And the minion response time is significantly reduced. What I don't understand is why the RES memory consumption is much higher in your screenshots. Maybe Ubuntu 19.10 is less optimized for the Pi than Raspbian Buster?
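When comparing numbers like these, one way to total the minion's resident memory is to sum the RSS column from ps. A sketch; the three RSS values piped in below are made-up sample input (in KB, as ps reports them), standing in for the output of `ps -C salt-minion -o rss=`:

```shell
# On a live system, feed this from: ps -C salt-minion -o rss=
# Here three sample RSS values (KB) demonstrate the arithmetic.
printf '30200\n15400\n15100\n' \
  | awk '{sum += $1} END {printf "%.1f MB\n", sum / 1024}'
# prints: 59.3 MB
```

This sums the per-process resident set sizes, which slightly overcounts shared pages, but it is a consistent way to compare runs on the same box.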
Hey @max-arnold, thanks for your reproduction! That is really interesting. I now disabled exactly the same list, but I still get the same high memory usage. This really could be related to the Ubuntu 64-bit environment. I also had to install Salt using git. Just one thing: I double-checked the modules using …
I see the same thing. It happens because …

/etc/salt/minion.d/disable.conf:

disable_grains:
- chronos
- cimc
- esxi
- fibre_channel
- fx2
- iscsi
- junos
- marathon
- mdadm
- mdata
- metadata
- napalm
- nvme
- nxos
- panos
- philips_hue
- rest_sample
- smartos
- ssh_sample
- zfs
disable_modules:
- aliases
- alternatives
- ansiblegate
- artifactory
- bigip
- btrfs
- zcbuildout
- chroot
- cloud
- composer
- consul
- container_resource
- cpan
- cryptdev
- data
- dig
- djangomod
- dnsmasq
- drbd
- etcd_mod
- ethtool
- extfs
- freezer
- gem
- genesis
- git
- glassfish
- google_chat
- grafana4
- highstate_doc
- hipchat
- incron
- ini_manage
- inspector
- introspect
- iosconfig
- iptables
- iwtools
- jboss7
- jboss7_cli
- k8s
- key
- keyboard
- kmod
- locate
- logrotate
- lvm
- mandrill
- mattermost
- modjk
- msteams
- nagios
- nagios_rpc
- namecheap_domains
- namecheap_domains_dns
- namecheap_domains_ns
- namecheap_ssl
- namecheap_users
- nexus
- nfs3
- nginx
- nova
- npm
- nspawn
- nxos_api
- openscap
- openstack_config
- opsgenie
- pagerduty
- pagerduty_util
- parallels
- peeringdb
- publish
- pushover_notify
- pyenv
- raid
- random
- random_org
- rbenv
- rest_sample_utils
- rsync
- rvm
- s3
- s6
- salt_proxy
- saltcheck
- scsi
- sdb
- seed
- serverdensity_device
- slack_notify
- smbios
- smtp
- solrcloud
- sqlite3
- ssh
- statuspage
- supervisord
- syslog_ng
- telegram
- telemetry
- tls
- travisci
- vault
- vbox_guest
- virtualenv_mod
- vsphere
- x509
- xml
- xfs
- zabbix
- zenoss
And I guess you are right: the increased memory consumption is caused by the 64-bit arch.

salt-call --versions-report
@max-arnold so you're not seeing the same amount of memory usage? |
@waynew Yes, that is correct. Earlier I posted all the stats for a 32bit architecture. |
@sjkummer Running on a pi2 with raspbian 10 only used about 50M with the default 3000 install. It does seem possible that the 64-bit nature is what's causing increased memory consumption. Unfortunately I don't have access to a 64-bit pi right now. |
Trying to move this forward; I will need to follow up tomorrow or Monday.
@sjkummer or @max-arnold are either of you able to open a PR for this? This likely isn't something the Open Core Team can focus on right now, but we can keep it open for the community to pick up. I will leave this as-is and follow up again the next business day. |
I have no idea how to reduce memory usage (other than switching to a 32-bit OS), so I can't provide a PR |
Update |
putting this into |
This needs a 64-bit Raspberry Pi to test; I will attempt to assign this out to get it unblocked.
We are able to test this now, and it may be that we have fixed this in v3003.1. Putting this back through triage to test.
The Silicon release of Salt will have official Ubuntu ARM 64bit packages for LTS releases and we will be testing this OS as part of the testing pipeline. Based on the comments that memory usage was significantly reduced in 3001, we'll go ahead and close this one out. Please feel free to open a new issue if the problem persists. |
Description of Issue
I am running the latest release v2019.2.3 on a Raspberry Pi 3 B and it uses nearly 25% of the memory (250MB of 1GB). There are no running tasks - the minion is "idle".
Why is so much memory needed?
Setup
Saltstack minion on a Raspberry Pi 3 B V1.2
Versions Report
Salt Version:
Salt: 2019.2.3
Dependency Versions:
cffi: Not Installed
cherrypy: Not Installed
dateutil: Not Installed
docker-py: Not Installed
gitdb: Not Installed
gitpython: Not Installed
ioflo: Not Installed
Jinja2: 2.10
libgit2: Not Installed
libnacl: Not Installed
M2Crypto: 0.31.0
Mako: Not Installed
msgpack-pure: Not Installed
msgpack-python: 0.5.6
mysql-python: Not Installed
pycparser: Not Installed
pycrypto: 2.6.1
pycryptodome: Not Installed
pygit2: Not Installed
Python: 2.7.17 (default, Nov 7 2019, 10:07:09)
python-gnupg: Not Installed
PyYAML: 5.1.2
PyZMQ: 17.1.2
RAET: Not Installed
smmap: Not Installed
timelib: Not Installed
Tornado: 5.1.1
ZMQ: 4.3.2
System Versions:
dist: Ubuntu 19.10 eoan
locale: UTF-8
machine: aarch64
release: 5.3.0-1007-raspi2
system: Linux
version: Ubuntu 19.10 eoan