
Using Node.JS, how do I read a JSON file into (server) memory?

Background

I am doing some experimentation with Node.js and would like to read a JSON object, either from a text file or a .js file (which is better?), into memory so that I can access that object quickly from code. I realize that there are things like Mongo, Alfred, etc. out there, but that is not what I need right now.

Question

How do I read a JSON object out of a text or js file and into server memory using JavaScript/Node?


mihai

Sync:

var fs = require('fs');
var obj = JSON.parse(fs.readFileSync('file', 'utf8'));

Async:

var fs = require('fs');
var obj;
fs.readFile('file', 'utf8', function (err, data) {
  if (err) throw err;
  obj = JSON.parse(data);
});

I think JSON.parse is synchronous; it comes straight from V8, which means that even with the async approach you have to be careful with large JSON files, since parsing would tie up Node.
For the sake of completeness: there is an npm package called jsonfile.
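For illustration, a minimal sketch of that jsonfile package, assuming its documented readFile/readFileSync API (the file path is hypothetical):

var jsonfile = require('jsonfile');
var file = './data.json'; // hypothetical path

// Async: the callback receives the already-parsed object
jsonfile.readFile(file, function (err, obj) {
  if (err) throw err;
  console.log(obj);
});

// Sync: returns the parsed object directly
var objSync = jsonfile.readFileSync(file);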
I can't believe it was so difficult to find this simple thing. Every answer I got from Google was doing an HTTP request, using jQuery, or doing it in the browser.
Two points: (1) The synchronous answer should just be let imported = require("file.json"). (2) JSON.parse must be asynchronous, because I used this code to load a 70 MB JSON file into memory as an object. It takes milliseconds this way, but if I use require(), it chugs.
For people finding this answer in 2019 and onward: Node.js has had native JSON support through require for many, many versions, so this answer is no longer applicable if you just want to load a JSON file. Just use let data = require('./yourjsonfile.json') and off you go (with the footnote that if the performance of require is impacting your code, you have problems well beyond "wanting to load a .json file").
Travis Tidwell

The easiest way I have found to do this is to just use require and the path to your JSON file.

For example, suppose you have the following JSON file.

test.json

{
  "firstName": "Joe",
  "lastName": "Smith"
}

You can then easily load this in your Node.js application using require:

var config = require('./test.json');
console.log(config.firstName + ' ' + config.lastName);

Just so folks know, and if I remember correctly, require in Node runs synchronously. Dive in deep here.
Another issue/benefit with this method is that the required data is cached unless you specifically delete the cached instance.
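For illustration, a minimal sketch of busting that cache so a later require re-reads the file (reusing the test.json example above):

// require() caches modules, including .json files, keyed by resolved path
var config = require('./test.json');

// Deleting the cache entry forces the next require to re-read the file from disk
delete require.cache[require.resolve('./test.json')];
var freshConfig = require('./test.json');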
"require" is meant to be used to load modules or config file you are using through out the lifespan of your application. does not seem right to use this to load files.
I'd say this is potentially a security threat. If the JSON file you're loading contains JS code, would require-ing it run that code? If so, then you really need to control where your JSON files are coming from, or an attacker could run malicious code on your machine.
This is a sound solution for small DevOps scripts or batch operations. You have to balance human time with performance. For something you can commit to memory and use quickly in these appropriate cases, this is tops. Not every task involves Big Data™ and hostile execution environments.
Florian Margaine

Asynchronous is there for a reason! Throws stone at @mihai

Otherwise, here is his code using the asynchronous version:

// Declare variables
var fs = require('fs'),
    obj

// Read the file and send to the callback
fs.readFile('path/to/file', handleFile)

// Write the callback function
function handleFile(err, data) {
    if (err) throw err
    obj = JSON.parse(data)
    // You can now play with your data
}

agreed :), added async as well
Great :-) I don't like inline callbacks, though; they can lead to callback nightmares that I'd rather avoid.
It's there for a reason.. unless you want it synchronously.
Alex Eftimiades

At least in Node v8.9.1, you can just do

var json_data = require('/path/to/local/file.json');

and access all the elements of the JSON object.


This approach loads the file only once. If you change file.json after it has been required (without restarting the program), the data will still be from the first load. I don't have a source to back this up, but I observed it in an app I am building.
Your answer is woefully incomplete. What that gets you is an object, and it doesn't even bother to implement toString().
@DavidA.Gray The question wants to be able to access the objects as objects, not as strings. Aside from the singleton issue Lukas mentioned, this answer is fine.
Using require will also execute arbitrary code in the file. This method is insecure and I would recommend against it.
Florian Ledermann

Answer for 2022, using ES6 module syntax and async/await

In modern JavaScript, this can be done as a one-liner, without the need to install additional packages:

import { readFile } from 'fs/promises';

let data = JSON.parse(await readFile("filename.json", "utf8"));

Add a try/catch block to handle exceptions as needed.
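For example, a minimal sketch of one place the try/catch could go (using the same placeholder file name as above):

import { readFile } from 'fs/promises';

let data;
try {
  // Either the read or the parse can throw here
  data = JSON.parse(await readFile("filename.json", "utf8"));
} catch (err) {
  console.error("Could not load filename.json:", err);
}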


Where would you put the try catch?
I was looking for this, thank you! Works great when I know that the file's content is JSON data, but the extension is customized. The usual require('./jsonfile.xyz') cannot be used in this situation.
jpsecher

In Node 8 you can use the built-in util.promisify() to asynchronously read a file like this

const {promisify} = require('util')
const fs = require('fs')
const readFileAsync = promisify(fs.readFile)

readFileAsync(`${__dirname}/my.json`, {encoding: 'utf8'})
  .then(contents => {
    const obj = JSON.parse(contents)
    console.log(obj)
  })
  .catch(error => {
    throw error
  })

.readFile is already async; if you're looking for the sync version, its name is .readFileSync.
If you want to use promises, there's also fs/promises as of Node 10. Note: the API is experimental: nodejs.org/api/fs.html#fs_fs_promises_api
@Aternus .readFile is asynchronous, but not async. Meaning, the function is not defined with async keyword, nor does it return a Promise, so you can't do await fs.readFile('whatever.json');
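For comparison, a minimal sketch using the promise-based API (fs.promises, available since Node 10), which can be awaited; the file path matches the example above:

const fs = require('fs');

(async () => {
  // fs.promises.readFile returns a Promise, so it works with await
  const contents = await fs.promises.readFile(`${__dirname}/my.json`, 'utf8');
  const obj = JSON.parse(contents);
  console.log(obj);
})();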
@Kip how about a CodeSandBox?
Arturo Menchaca

Using the fs-extra package is quite simple:

Sync:

const fs = require('fs-extra')

const packageObj = fs.readJsonSync('./package.json')
console.log(packageObj.version) 

Async:

const fs = require('fs-extra')

// Note: await must appear inside an async function (or an ES module with top-level await)
const packageObj = await fs.readJson('./package.json')
console.log(packageObj.version) 

ifelse.codes

Using node-fs-extra (async/await):

const fs = require('fs-extra');

const readJsonFile = async () => {
    const myJsonObject = await fs.readJson('./my_json_file.json');
    console.log(myJsonObject);
}

readJsonFile() // prints your json object

xgqfrms

https://nodejs.org/dist/latest-v6.x/docs/api/fs.html#fs_fs_readfile_file_options_callback

var fs = require('fs');  

fs.readFile('/etc/passwd', (err, data) => {
  if (err) throw err;
  console.log(data);
});  

// options
fs.readFile('/etc/passwd', 'utf8', callback);

https://nodejs.org/dist/latest-v6.x/docs/api/fs.html#fs_fs_readfilesync_file_options

You can find all the usage details in the Node.js File System docs. Hope this helps!
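For completeness, a minimal sketch of the synchronous variant from the second link, applied to a JSON file (the path is illustrative):

var fs = require('fs');

// readFileSync blocks until the file is read; JSON.parse then builds the object
var data = fs.readFileSync('/path/to/file.json', 'utf8');
var obj = JSON.parse(data);
console.log(obj);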


Oliver Salzburg
function parseIt(){
    return new Promise(function(resolve, reject){
        var fs = require('fs');
        const dirPath = 'K:\\merge-xml-junit\\xml-results\\master.json';
        // Read the file and settle the promise from the callback
        fs.readFile(dirPath, 'utf8', function(err, data){
            if(err) reject(err);
            else resolve(data);
        });
    });
}

async function test(){
    const jsonData = await parseIt();
    var parsedJSON = JSON.parse(jsonData);
    var testSuite = parsedJSON['testsuites']['testsuite'];
    console.log(testSuite);
}

test();

Jehy

So many answers, and no one ever made a benchmark to compare sync vs. async vs. require. I described the difference in use cases of reading JSON into memory via require, readFileSync, and readFile here.
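As a rough, hedged sketch of what such a comparison could look like (the file name is hypothetical, and a real benchmark should use many iterations and separate processes):

const fs = require('fs');

console.time('require');
const viaRequire = require('./big.json');
console.timeEnd('require');

console.time('readFileSync');
const viaSync = JSON.parse(fs.readFileSync('./big.json', 'utf8'));
console.timeEnd('readFileSync');

console.time('readFile');
fs.readFile('./big.json', 'utf8', (err, data) => {
  if (err) throw err;
  const viaAsync = JSON.parse(data);
  console.timeEnd('readFile');
});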


Hitesh Sahu

If you are looking for a complete solution for asynchronously loading a JSON file from a relative path, with error handling:

// Global variables

// Require the path module for building the relative path
const path = require('path');
// Require the File System module
var fs = require('fs');

// GET request for the /listUsers page
router.get('/listUsers', function (req, res) {
    console.log("Got a GET request for list of users");

    // Build a path relative to this file
    let reqPath = path.join(__dirname, '../mock/users.json');

    // Read the JSON file from the relative path
    fs.readFile(reqPath, 'utf8', function (err, data) {
        if (!err) {
            // Handle success
            console.log("Success: " + data);
            // Parse the data into a JSON object (or send the raw string back)
            var jsonObj = JSON.parse(data);
            // Send it back as the response
            res.end(data);
        } else {
            // Handle error
            res.end("Error: " + err);
        }
    });
});

Directory Structure:

https://i.stack.imgur.com/Bz7jB.png