Hi guys,
We are a small software company that mainly develops LAMP-based business software for clients, and as a side effect of this we host many of those clients' sites in a shared environment.
Hosting isn't our main game, so as far as I can tell it's a fairly basic shared setup using virtual hosts. It's on AWS and we generally have no issues with resources or performance.
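For context, each client site is just a name-based virtual host along these lines (the domain and paths here are invented for illustration):

```
# Illustrative only -- real domains/paths differ.
# (NameVirtualHost *:80 is declared once in the main config.)
<VirtualHost *:80>
    ServerName clientsite.example.com
    DocumentRoot /var/www/clientsite
    ErrorLog /var/log/apache2/clientsite-error.log
    CustomLog /var/log/apache2/clientsite-access.log combined
</VirtualHost>
```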
We recently had an issue where a client installed a plugin on their site that made it take part in some sort of DDoS attack, which unfortunately brought down the entire server: CPU usage hit 100% and stayed there until the server was restarted. We had to disable the site entirely, as re-enabling it kept bringing the server down.
What I'm looking for is a way to prevent one site from getting overloaded and killing the entire server. Is there a way to limit resources (RAM/CPU/network etc.) per virtual host? If not, can Apache as a whole be capped at, say, 75% of the machine's resources? That way we could at least still access the server to see what is happening.
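To make the 75% idea concrete, is something like a cgroup CPU quota on the Apache parent process the right direction? A rough sketch of what I have in mind (this assumes cgroups v1 with the cpu controller mounted at /sys/fs/cgroup/cpu, Ubuntu-style apache2 process/pidfile names, and a made-up cgroup called "apache"; I haven't tested it):

```
# Create a cgroup and cap it at 75% of one CPU core:
# 75000us of CPU time per 100000us period. (For N cores,
# quota would be N * 75000.)
sudo mkdir /sys/fs/cgroup/cpu/apache
echo 100000 | sudo tee /sys/fs/cgroup/cpu/apache/cpu.cfs_period_us
echo 75000  | sudo tee /sys/fs/cgroup/cpu/apache/cpu.cfs_quota_us

# Put the Apache parent process in the group; workers it forks
# inherit the cgroup. PidFile path assumed from Debian/Ubuntu defaults.
cat /var/run/apache2.pid | sudo tee /sys/fs/cgroup/cpu/apache/tasks
```

As I understand it, though, that would only cap Apache as a whole rather than individual sites, which is why I'm asking whether per-virtualhost limits exist at all.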
We're on Apache 2.2.22, and switching to another web server isn't possible because the business software only supports this stack.
I'm fairly new to this (my main role in the company is elsewhere), so I'd appreciate any direction on what to look into.
Thanks guys