[PATCH (WIP)] limit CPU usage of cgit processes

Ferry Huberts mailings at hupie.com
Wed Mar 21 08:23:12 CET 2012


How about using a robots.txt on your site?
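
For example, something like this in robots.txt keeps well-behaved
crawlers away from the generated pages entirely (assuming cgit is
served under /cgit/; adjust the path to match your setup):

    User-agent: *
    Disallow: /cgit/

Of course this only helps against bots that actually honor robots.txt.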

On 21-03-12 02:52, Eric Wong wrote:
> Here's a work-in-progress patch which I've been running to
> prevent crawlers/bots from using up all the CPU on my system
> when doing expensive queries.
>
> If it's interesting, it should be wired up to an appropriate
> config option...
>
> Signed-off-by: Eric Wong <normalperson at yhbt.net>
> ---
>   cgit.c |   13 +++++++++++++
>   1 file changed, 13 insertions(+)
>
> diff --git a/cgit.c b/cgit.c
> index 1d50129..285467c 100644
> --- a/cgit.c
> +++ b/cgit.c
> @@ -768,12 +768,25 @@ static int calc_ttl()
>   	return ctx.cfg.cache_repo_ttl;
>   }
>
> +#include <sys/time.h>
> +#include <sys/resource.h>
> +static void init_rlimit(void)
> +{
> +	struct rlimit rlim = { .rlim_cur = 10, .rlim_max = 10 };
> +	if (setrlimit(RLIMIT_CPU, &rlim) != 0) {
> +		perror("setrlimit");
> +		exit(EXIT_FAILURE);
> +	}
> +}
> +
>   int main(int argc, const char **argv)
>   {
>   	const char *path;
>   	char *qry;
>   	int err, ttl;
>
> +	init_rlimit();
> +
>   	prepare_context(&ctx);
>   	cgit_repolist.length = 0;
>   	cgit_repolist.count = 0;
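
A note for anyone trying this: the patch hardcodes a limit of 10 CPU
seconds, and with RLIMIT_CPU the kernel delivers SIGXCPU when the soft
limit is exceeded (and SIGKILL at the hard limit), so an unhandled
limit simply kills the cgit process mid-request. A minimal standalone
sketch of that behavior, not part of the patch (the one-second limits
here are just for demonstration):

    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <signal.h>
    #include <sys/time.h>
    #include <sys/resource.h>

    /* runs when the soft RLIMIT_CPU limit is exceeded */
    static void on_xcpu(int sig)
    {
        static const char msg[] = "caught SIGXCPU, exiting\n";
        (void)sig;
        /* write() is async-signal-safe; stdio is not */
        write(STDERR_FILENO, msg, sizeof(msg) - 1);
        _exit(1);
    }

    int main(void)
    {
        /* soft limit 1s -> SIGXCPU; hard limit 2s -> SIGKILL */
        struct rlimit rlim = { .rlim_cur = 1, .rlim_max = 2 };

        signal(SIGXCPU, on_xcpu);
        if (setrlimit(RLIMIT_CPU, &rlim) != 0) {
            perror("setrlimit");
            return EXIT_FAILURE;
        }
        for (;;)
            ; /* burn CPU until the limit fires */
    }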
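
And on wiring it up to a config option: a purely hypothetical sketch,
following the atoi()-style integer options already handled in
config_cb() in cgit.c. The "max-cpu-time" name and the
ctx.cfg.max_cpu_time field are made up here, and init_rlimit() would
then have to run after the config file is parsed rather than at the
very top of main():

    /* in config_cb(), next to the other integer options */
    else if (!strcmp(name, "max-cpu-time"))
        ctx.cfg.max_cpu_time = atoi(value);

    /* init_rlimit(), reworked to honor the option */
    static void init_rlimit(void)
    {
        struct rlimit rlim;

        if (ctx.cfg.max_cpu_time <= 0)
            return; /* 0 or unset: no limit (default) */
        rlim.rlim_cur = rlim.rlim_max = ctx.cfg.max_cpu_time;
        if (setrlimit(RLIMIT_CPU, &rlim) != 0) {
            perror("setrlimit");
            exit(EXIT_FAILURE);
        }
    }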

-- 
Ferry Huberts
