Possible bug in json parser (v1.4.0)

Jan 20, 2014 at 6:33 PM
Hi folks,
Great work - thanks for the XP support :)

I was just diagnosing a difficult bug today (of the kind that only appears on some PCs...). I narrowed it down to the json parser - see below.

The code at the bottom demonstrates behavior that hints at a bug in the json parser (possibly something to do with threading, as the results aren't 100% consistent). The problem appears when values in a nested object contain backslashes. When parsing from a file, characters around the backslashes are 'eaten up' by the parser.

As I mentioned, the results aren't 100% consistent (I have yet to observe the error in debug builds), but if you compile in release mode and play around with NUM_ITERATIONS I am confident you will observe the same.

Let me know if you need more info...


An example of what I last got:

The code:
#include <cpprest/json.h>
#include <tchar.h>
#include <string>
#include <fstream>
#include <iostream>

using namespace std;
using namespace web;

const wstring INPUT_FILE = L"input_file.dat";
const wstring OUTPUT_FILE = L"output_file.dat";
const wstring JSON_INPUT_FILE = L"json_input_file.json";
const wstring JSON_OUTPUT_FILE = L"json_output_file.json";
const wstring CONTAINED_DATA_ID = L"contained_data";
const int NUM_ITERATIONS = 1;

void json_to_file(json::value & val, const std::wstring & file_name);

int _tmain(int argc, _TCHAR* argv[])
{
    auto data_map = json::value::object();

    // Create a bunch of entries with backslashes
    {
        const wchar_t FORMAT1[] = L"%d";
        wchar_t buffer1[1024] = { 0 };
        const wchar_t FORMAT2[] = L"c:\\dir1_%d\\dir1_%d\\file_%d.data";
        wchar_t buffer2[2048] = { 0 };

        for (int i = 0; i < NUM_ITERATIONS; ++i)
        {
            swprintf_s(buffer1, FORMAT1, i);
            swprintf_s(buffer2, FORMAT2, i, i + 1, i + 2);
            data_map[buffer1] = json::value::string(buffer2);
        }
    }

    auto container = json::value::object();
    container[CONTAINED_DATA_ID] = data_map;

    { // Save to file
        wofstream output_file(JSON_INPUT_FILE);
        if (!output_file)
            return 1;

        output_file << container;
    }
    json_to_file(container, INPUT_FILE);

    { // Parse back from the json file
        wifstream input_file(JSON_INPUT_FILE);
        if (!input_file)
            return 1;

        json::value input_val;
        input_file >> input_val;

        { // Save back to json file
            wofstream json_output_file(JSON_OUTPUT_FILE);
            if (json_output_file)
                json_output_file << input_val;
        }

        json_to_file(input_val, OUTPUT_FILE);
    }

    return 0;
}

void json_to_file(json::value & val, const std::wstring & file_name)
{
    wofstream output_file(file_name);
    if (!output_file)
        throw std::runtime_error("could not open file.");

    auto contained = val[CONTAINED_DATA_ID];

    for (auto value : contained)
    {
        output_file << L".as_string(): " << value.first.as_string() << L"=[" << value.second.as_string() << L"]\n";
        output_file << L".to_string(): " << value.first.as_string() << L"=[" << value.second.to_string() << L"]\n\n";
    }
}
Jan 28, 2014 at 8:05 PM
Hi yiannis_ilydauk,

I can't reproduce your issue. Can you please provide us with the following info:
  • what line exactly throws or has problematic behavior?
  • what OS you are using Casablanca on?
  • what type of PCs you are referring to?
Feb 4, 2014 at 1:35 PM
Hi Oggy,
Did you try a release build? (The difference in behavior compared to debug builds might be attributable to the extra data that's allocated in json objects for use by the visualizer, for example.)

I initially observed the behavior on an old XP PC (x86, running in a VM). Since we were doing static linking, which is currently not supported, I also reproduced it on a Win8.1 PC (an x64 multicore machine; I did a 32-bit compilation) using the NuGet package in a clean project.

Does the parser use parallelism?

I have worked around the problem on my end, so it is not urgent for me, but I thought I'd report it anyway.

Feb 4, 2014 at 5:29 PM
Hi yiannis_ilydauk,

We'll try with different flavors/architectures and let you know.

Our parser is not multi-threaded; it uses a single thread to parse json.

Good that you found a workaround for your problem.

Thanks for reporting it!