Error downloading snapshots from remote sites

Roy Marples roy at
Mon May 4 15:29:19 CEST 2020

Hi Alex

On 04/05/2020 14:23, Alex Xu (Hello71) wrote:
> Excerpts from Roy Marples's message of May 4, 2020 5:45 am:
>> Hi
>> I only get this error downloading from a remote VM.
>> Another user has reported the error here:
>> Here is a sample error logged by nginx:
>> I use uwsgi to proxy the CGI application to nginx.
>> *248 open() "/var/db/nginx/uwsgi_temp/6/00/0000000006" failed (13: Permission
>> denied) while reading upstream, client: 2001:470:a085:999::80, server:
>>, request: "GET /cgit/dhcpcd.git/snapshot/master.tar.xz
>> HTTP/1.1", upstream: "uwsgi://unix:/var/run/uwsgi/cgit.sock:", host:
>> ""
>> Otherwise this setup works fine.
>> The user uwsgi/cgit runs as owns the directory /var/db/nginx/uwsgi_temp.
>> Any pointers on how to fix this?
>> Thanks
>> Roy
> nginx buffers large responses to disk by default. Most likely your whole
> uwsgi buffering setup is broken, but you only see it with snapshots because
> nginx decides that the response cannot fit into memory. In particular,
> /var/db/nginx needs to be owned by the nginx user (nginx/http/www-data),
> not the uwsgi user.
> This behavior can be adjusted with various tuning knobs: you may want
> to set uwsgi_buffering off. However, this prevents some useful features,
> including, as I recall, output compression. You can also try setting
> something like location ~ /snapshot/ { uwsgi_buffering off; }. That
> won't fix your temp_path issue though, which you will still have if
> someone requests some other large file.

Thanks for the reply.
It turned out the process needed search (execute) permission on ALL the
directories in the path, and one of them was locked out.
Discovered thanks to this post:
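One way to spot this kind of lockout is to list the mode of every directory component along the path: a missing execute ("x") bit at any level denies traversal to everything below it. A minimal sketch, using a throwaway path created just for illustration:

```shell
#!/bin/sh
# Every directory component of a path needs the execute (search) bit
# for the accessing user, not just the final directory.
# The directory tree below is made up for illustration only.
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/db/nginx/uwsgi_temp"

# Walk up from the leaf and print the mode of each component;
# a missing "x" anywhere in this list is the lockout point.
dir="$tmp/db/nginx/uwsgi_temp"
while [ "$dir" != "/" ]; do
    ls -ld "$dir"
    dir=$(dirname "$dir")
done

rm -rf "$tmp"
```

On Linux systems with util-linux installed, `namei -m /path/to/dir` prints the same per-component mode listing in one command.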



More information about the CGit mailing list