• read json from file

    From Mortifis@VERT/EPHRAM to All on Sat Jun 20 11:24:37 2020
    I have a local JSON file that I'd like to search for a matching unknown record # without having to read in the entire object (the file has over 200,000 records spanning over 1.8 million lines) ... trying to read in the entire file errors out with "out of memory" ...

    any ideas?

    Thanks,

    ~Mortifis

    ---
    ■ Synchronet ■ Realm of Dispair BBS - http://ephram.synchro.net:82
  • From echicken@VERT/ECBBS to Mortifis on Sat Jun 20 16:16:15 2020
    Re: read json from file
    By: Mortifis to All on Sat Jun 20 2020 11:24:37

    I have a local JSON file that I'd like to search for a matching unknown record
    # without having to read in the entire object (the file has over 200,000 records

    If your file looks like this:

    {"a":0,"b":1,"c":2,"d":3}

    You basically need to parse the whole thing. Or write your own special parser.

    If your file is like this:

    {"a":0,"b":1,"c":2,"d":3} {"e":4,"f":5,"g":6,"h":7}

    Then those are two separate JSON strings on adjacent lines. In which case you can read the file line by line, and each line will work with JSON.parse.
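
    For example, a minimal sketch of that line-by-line approach (untested; assumes one complete JSON object per line, and a hypothetical "records.json" with an "id" field to match on):

    var f = new File(js.exec_dir + "records.json"); // hypothetical file name
    if (f.open("r")) {
        while (!f.eof) {
            var line = f.readln();
            if (line === null || line === '') continue; // skip blanks / EOF read
            var rec = JSON.parse(line); // each line is a complete JSON string
            if (rec.id === 12345) { // hypothetical search key
                writeln(JSON.stringify(rec));
                break; // stop as soon as the record is found
            }
        }
        f.close();
    }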

    ---
    echicken
    electronic chicken bbs - bbs.electronicchicken.com
    ■ Synchronet ■ electronic chicken bbs - bbs.electronicchicken.com
  • From Mortifis@VERT/EPHRAM to echicken on Sat Jun 20 21:43:56 2020
    Re: read json from file
    By: Mortifis to All on Sat Jun 20 2020 11:24:37

    I have a local JSON file that I'd like to search for a matching unknown record
    # without having to read in the entire object (the file has over 200,000 records

    If your file looks like this:

    {"a":0,"b":1,"c":2,"d":3}

    You basically need to parse the whole thing. Or write your own special parser.

    If your file is like this:

    {"a":0,"b":1,"c":2,"d":3}
    {"e":4,"f":5,"g":6,"h":7}

    Then those are two separate JSON strings on adjacent lines. In which case you can read the file line by line, and each line will work with JSON.parse.

    Sadly ... it looks like this:

    [
      {
        "id": 707860,
        "name": "Hurzuf",
        "country": "UA",
        "coord": {
          "lon": 34.283333,
          "lat": 44.549999
        }
      },
      {
        "id": 519188,
        "name": "Novinki",
        "country": "RU",
        "coord": {
          "lon": 37.666668,
          "lat": 55.683334
        }
      },
      {
        "id": 1283378,
        "name": "Gorkhā",
        "country": "NP",
        "coord": {
          "lon": 84.633331,
          "lat": 28
        }
      },

    .... and that's just the first three of 209,578 records ... gotta read all 209,578 records (1,886,213 lines)? ... bleh ... lol

    ---
    ■ Synchronet ■ Realm of Dispair BBS - http://ephram.synchro.net:82
  • From echicken@VERT/ECBBS to Mortifis on Sat Jun 20 22:57:44 2020
    Re: Re: read json from file
    By: Mortifis to echicken on Sat Jun 20 2020 21:43:56

    .... and that's just the first three of 209,578 records ... gotta read all
    209,578 records (1,886,213 lines)? ... bleh ... lol

    Not necessarily, but you'll need something more than what we have on hand.

    Streaming JSON parsers for handling really large files are a thing, but I don't know if there's one that can be readily ported to our environment.

    JSON.parse() only wants to parse a complete JSON string. You'd need to be able to pre-process what you're reading from the file to be sure that JSON.parse() won't choke on it.

    That's tricky to do in a generic way that could handle any old JSON you throw at it.

    Easier if you do it as a custom job for this particular file, and if this file is just a flat array of objects, all with the same keys and types of values. It's either going to be a bit complicated but fairly solid, or simple and hacky and maybe not super reliable.
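
    Something like this might do as a starting point for the custom-job route (untested sketch; uses naive brace counting, so it'll misfire if a string value ever contains '{' or '}'; "citylist.json" and the "Novinki" test are just placeholders):

    var f = new File(js.exec_dir + "citylist.json"); // hypothetical path
    f.open("r");
    var buf = '';
    var depth = 0;
    var found = null;
    while (!f.eof && found === null) {
        var line = f.readln();
        if (line === null) break;
        for (var i = 0; i < line.length; i++) {
            var c = line.charAt(i);
            if (c == '{') depth++;
            if (depth > 0) buf += c; // only collect text inside an object
            if (c == '}' && --depth == 0) { // one complete { ... } record
                var rec = JSON.parse(buf);
                buf = '';
                if (rec.name == 'Novinki') found = rec; // hypothetical test
            }
        }
    }
    f.close();
    if (found !== null) writeln(found.name + ', ' + found.country);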

    ---
    echicken
    electronic chicken bbs - bbs.electronicchicken.com
    ■ Synchronet ■ electronic chicken bbs - bbs.electronicchicken.com
  • From Mortifis@VERT/EPHRAM to echicken on Sun Jun 21 12:32:56 2020
    Re: Re: read json from file
    By: Mortifis to echicken on Sat Jun 20 2020 21:43:56

    .... and that's just the first three of 209,578 records ... gotta read all
    209,578 records (1,886,213 lines)? ... bleh ... lol

    Not necessarily, but you'll need something more than what we have on hand.

    Streaming JSON parsers for handling really large files are a thing, but I don't know if there's one that can be readily ported to our environment.

    JSON.parse() only wants to parse a complete JSON string. You'd need to be able to pre-process what you're reading from the file to be sure that JSON.parse() won't choke on it.

    That's tricky to do in a generic way that could handle any old JSON you throw at it.

    Easier if you do it as a custom job for this particular file, and if this file is just a flat array of objects, all with the same keys and types of values. It's either going to be a bit complicated but fairly solid, or simple and hacky and maybe not super reliable.

    This worked:

    load("sbbsdefs.js");
    var infile = js.exec_dir + "owm-citylist.json";

    write('Enter City Name: ');
    var what = readln().toUpperCase();

    writeln('\r\n\r\nSearching for '+what+'\r\n');

    var j = new File(infile);
    var json = "";
    var match = false;

    j.open("r");

    while(!j.eof && !match) {
    json = "";
    for(var i = 1; i < 10; i++) { // 9 lines per record { ... }
    json += j.readln();
    json = json.replace(/^\s+/g, '');
    }
    json = json.slice(0, -1); // strip trailing ',' from string
    var obj = JSON.parse(json);
    var n = obj['name'].toUpperCase().indexOf(what);
    if(n >= 0) {
    writeln('Name: '+ obj['name']+' Country:'+obj['country']+'\r\n\r\n');
    match = true;
    }
    }
    j.close();

    currently exits on 1st match ...

    Thank you for pointing me in the right direction (again :)

    ~Mortifis

    ---
    ■ Synchronet ■ Realm of Dispair BBS - http://ephram.synchro.net:82