producer_consumer_buffer, large file and "multipart/form-data"

Mar 1, 2015 at 1:04 AM
hi,

I'm struggling to get this right. I found a nice explanation of how to implement the simple scenario where the file is loaded into memory upfront: https://casablanca.codeplex.com/discussions/568637

But is there a way to upload a file incrementally using producer_consumer_buffer (critical for big files, where loading all those MBs and GBs into memory makes no sense)? Something like an event that requests the next chunk of data.

Thank you!
Mar 2, 2015 at 11:39 AM
I had the same problem and solved it by extending the sample code in https://casablanca.codeplex.com/SourceControl/latest#Release/samples/SearchFile/searchfile.cpp
I agree it is not the most readable structure, but it works nicely, and if the server supports it, you can have multiple upload tasks running at once. The core idea is sketched below.
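A minimal sketch of that approach (not the sample verbatim): a producer task reads the file in chunks and pushes them into a producer_consumer_buffer while the HTTP client drains the other end as the request body. The endpoint URL, PUT method, and function name are placeholders for illustration.

```cpp
#include <cpprest/http_client.h>
#include <cpprest/producerconsumerstream.h>
#include <fstream>
#include <vector>

using namespace web::http;
using namespace web::http::client;
using namespace concurrency::streams;

// Hypothetical helper; "http://myserver/upload" is a made-up endpoint.
pplx::task<http_response> upload_in_chunks(const std::string &path)
{
    producer_consumer_buffer<uint8_t> buf;

    http_client client(U("http://myserver/upload"));
    http_request req(methods::PUT);
    // The client reads the request body from the buffer as data arrives.
    req.set_body(buf.create_istream(), U("application/octet-stream"));

    auto response = client.request(req);

    // Producer side: read the file chunk by chunk and push into the buffer.
    pplx::create_task([buf, path]() mutable
    {
        std::ifstream file(path, std::ios::binary);
        std::vector<uint8_t> chunk(64 * 1024);
        while (file.read(reinterpret_cast<char *>(chunk.data()), chunk.size())
               || file.gcount() > 0)
        {
            // Wait for each write so the chunk stays valid until it is consumed.
            buf.putn_nocopy(chunk.data(), static_cast<size_t>(file.gcount())).wait();
        }
        buf.close(std::ios_base::out).wait(); // signals end of the request body
    });

    return response;
}
```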
Marked as answer by roschuma on 3/2/2015 at 10:50 AM
Coordinator
Mar 3, 2015 at 12:13 AM
Hi denveloper,

There are a couple of different options. You could write to the producer_consumer_buffer in chunks. If your data is coming from a file, the best approach would probably be a file buffer/file_stream. The steps basically consist of calling the factory function on the file_stream class, file_stream<uint8_t>::open_istream(filename). This function is asynchronous and returns a task of an input stream, which you can then pass as the body of an http_request using the http_request::set_body(stream) method, along with the Content-Type. The stream will then be read in chunks (64KB by default, configurable with http_client_config::set_chunksize) and sent out. The BingRequest sample uses a file buffer to save the HTTP response; for your case you will want to use the file buffer as the request body instead.
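Put together, that looks roughly like this; a sketch only, where the endpoint URL, file name, and PUT method are placeholders:

```cpp
#include <cpprest/http_client.h>
#include <cpprest/filestream.h>

using namespace web::http;
using namespace web::http::client;
using namespace concurrency::streams;

// Hypothetical helper; "bigfile.bin" and the endpoint are made up.
pplx::task<http_response> upload_file()
{
    // Asynchronously open the file as an input stream.
    return file_stream<uint8_t>::open_istream(U("bigfile.bin"))
        .then([](istream fileStream)
    {
        http_client_config config;
        config.set_chunksize(64 * 1024); // 64KB is the default; shown for illustration

        http_client client(U("http://myserver/upload"), config);

        http_request req(methods::PUT);
        // The stream becomes the request body, read and sent chunk by chunk.
        req.set_body(fileStream, U("application/octet-stream"));
        return client.request(req);
    });
}
```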

Steve