Limiting memory consumption of producer_consumer_buffer when downloading large files

Mar 17, 2016 at 2:47 PM

I am using http_client::request() to download large files (several GByte). On the returned http_response I call body() to get an istream, and on that istream I repeatedly call read() to receive 1 MByte chunks, so that I can process the data on the fly (decrypting and calculating a hash value) before writing it to disk.

When the processing is slower than the download, the producer_consumer_buffer used internally allocates more and more memory.

Is there a way to limit the memory the producer_consumer_buffer allocates (and to suspend the download until the processing catches up)?
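What I am after is, roughly, a buffer with a hard capacity whose put() blocks the producer until the consumer has drained a chunk. A sketch in standard C++ of the behaviour I mean (the class and its interface are my own, not part of cpprestsdk):

```cpp
#include <condition_variable>
#include <cstddef>
#include <deque>
#include <mutex>
#include <vector>

// Hypothetical fixed-capacity buffer: put() blocks the producer (the
// download) while the buffer is full, take() blocks the consumer while
// it is empty. Not a cpprestsdk class, just the desired behaviour.
class bounded_buffer {
public:
    explicit bounded_buffer(std::size_t capacity) : capacity_(capacity) {}

    void put(std::vector<char> chunk) {
        std::unique_lock<std::mutex> lk(m_);
        not_full_.wait(lk, [&] { return q_.size() < capacity_; });
        q_.push_back(std::move(chunk));
        not_empty_.notify_one();
    }

    std::vector<char> take() {
        std::unique_lock<std::mutex> lk(m_);
        not_empty_.wait(lk, [&] { return !q_.empty(); });
        std::vector<char> chunk = std::move(q_.front());
        q_.pop_front();
        not_full_.notify_one();
        return chunk;
    }

private:
    std::size_t capacity_;
    std::deque<std::vector<char>> q_;
    std::mutex m_;
    std::condition_variable not_full_, not_empty_;
};
```

If the response body could be directed into something like this, the download would pause automatically once the buffer is full and resume as the processing catches up.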

Best regards,