I think that unless you have some crazy log traffic, if you can get 4GB for ELK in an SMB, you are nearly always good. I'd expect hundreds of servers to be able to log to that, as long as you have fast disks (the data still has to reach disk fast enough no matter how much memory there is).
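If it helps, the heap for a single-node ELK box is pinned in Elasticsearch's jvm.options. The path and sizes below are illustrative, not a prescription; the general guidance is to keep the heap at or below half the machine's RAM so the OS filesystem cache gets the rest:

```
# /etc/elasticsearch/jvm.options (path varies by install; 4g is illustrative)
# Set min and max heap equal to avoid resize pauses; roughly half the box's
# RAM, leaving the remainder for the filesystem cache Lucene leans on.
-Xms4g
-Xmx4g
```

With fast disks doing the heavy lifting, that 4GB heap mostly covers indexing buffers and query overhead.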

We've had massive Splunk databases with 32-64GB, but those were taking data from thousands and thousands of servers, and doing so as a high-availability failover cluster, so they had to ingest, index, and replicate in real time.