JSON.stringify() and JSON.parse(): The Complete Developer Guide for 2026
JSON.stringify() and JSON.parse() are two of the most frequently used methods in JavaScript. They appear in virtually every web application, API integration, configuration system, and data pipeline. Despite their apparent simplicity, both methods contain subtle behaviors, powerful parameters, and edge cases that even experienced developers overlook.
This guide covers everything you need to know about these two methods: their full API surface, every edge case that can trip you up, practical patterns for real-world development, performance considerations for large datasets, and security pitfalls that can compromise your application. Whether you are building a REST API, storing data in localStorage, or debugging a serialization bug at 2 AM, this is the reference you need.
What JSON.stringify() Does and How It Works
JSON.stringify() converts a JavaScript value into a JSON-formatted string. The method walks the object graph recursively, converting each value into its JSON representation according to a well-defined set of rules.
The full signature is:
JSON.stringify(value, replacer?, space?)
At its simplest, you pass a value and get a string back:
const user = {
name: "Alice",
age: 30,
roles: ["admin", "editor"],
active: true
};
const json = JSON.stringify(user);
// '{"name":"Alice","age":30,"roles":["admin","editor"],"active":true}'
The output is a compact string with no whitespace. Every key is double-quoted (JSON requires this, even though JavaScript does not). Strings use double quotes, numbers are unquoted, booleans are lowercase true/false, arrays use square brackets, and objects use curly braces.
The method handles nested structures of arbitrary depth:
const config = {
server: {
host: "0.0.0.0",
port: 8080,
tls: {
cert: "/etc/ssl/cert.pem",
key: "/etc/ssl/key.pem",
protocols: ["TLSv1.2", "TLSv1.3"]
}
},
database: {
primary: { host: "db1.internal", port: 5432 },
replica: { host: "db2.internal", port: 5432 }
}
};
console.log(JSON.stringify(config));
// Produces a single-line JSON string with all nesting preserved
Primitive values are also valid inputs. JSON.stringify() does not require an object or array:
JSON.stringify("hello"); // '"hello"' (note the quotes in the output)
JSON.stringify(42); // '42'
JSON.stringify(true); // 'true'
JSON.stringify(null); // 'null'
The Replacer Parameter: Filtering and Transforming Output
The second parameter to JSON.stringify() is the replacer. It controls which properties appear in the output and how values are transformed. It accepts two forms: a function or an array of strings.
Replacer as a Function
When you pass a function, it is called for every key-value pair in the object. The function receives the key and value, and its return value determines what appears in the output:
const user = {
name: "Alice",
email: "alice@example.com",
password: "s3cret!",
age: 30,
lastLogin: new Date("2026-02-10T14:30:00Z")
};
// Remove sensitive fields, format dates
const json = JSON.stringify(user, (key, value) => {
if (key === "password") return undefined; // omit from output
if (key === "email") return "***REDACTED***";
return value; // keep everything else
});
console.log(json);
// '{"name":"Alice","email":"***REDACTED***","age":30,"lastLogin":"2026-02-10T14:30:00.000Z"}'
Key behaviors of the replacer function:
- Returning `undefined` omits the property entirely from the output.
- The function is called with an empty string key (`""`) for the root object itself. This is the first call and lets you transform or replace the entire value before serialization begins.
- The `this` context inside the replacer is the object that holds the current key. For the root call, it is `{"": value}`.
- The function is called recursively for nested objects and arrays.
Here is a more advanced example that handles nested objects and adds type annotations:
// Log-friendly serializer that adds type info for debugging
function debugReplacer(key: string, value: unknown): unknown {
if (value instanceof Map) {
return { __type: "Map", entries: [...value.entries()] };
}
if (value instanceof Set) {
return { __type: "Set", values: [...value.values()] };
}
if (value instanceof RegExp) {
return { __type: "RegExp", source: value.source, flags: value.flags };
}
if (typeof value === "bigint") {
return { __type: "BigInt", value: value.toString() };
}
return value;
}
const data = {
users: new Map([["alice", { role: "admin" }], ["bob", { role: "viewer" }]]),
tags: new Set(["javascript", "typescript", "node"]),
pattern: /^[a-z]+$/gi,
bigId: 9007199254740993n
};
console.log(JSON.stringify(data, debugReplacer, 2));
Replacer as an Array (Allowlist)
When you pass an array of strings, only those properties are included in the output. This acts as an allowlist:
const user = {
id: 1,
name: "Alice",
email: "alice@example.com",
password: "hashed_value",
createdAt: "2026-01-15",
internalNotes: "VIP customer"
};
// Only include safe fields in API response
const safeJson = JSON.stringify(user, ["id", "name", "email", "createdAt"]);
// '{"id":1,"name":"Alice","email":"alice@example.com","createdAt":"2026-01-15"}'
The array replacer only works for object properties, not array elements. It applies at every level of nesting, so nested objects will also be filtered to only include the listed keys.
The Space Parameter: Pretty-Printing JSON
The third parameter controls indentation and formatting. It accepts a number (spaces per level) or a string (used as the indent character):
const data = { name: "Alice", scores: [95, 87, 92], active: true };
// 2-space indentation (most common)
console.log(JSON.stringify(data, null, 2));
// {
// "name": "Alice",
// "scores": [
// 95,
// 87,
// 92
// ],
// "active": true
// }
// 4-space indentation
console.log(JSON.stringify(data, null, 4));
// Tab indentation
console.log(JSON.stringify(data, null, "\t"));
// Custom prefix (up to 10 characters)
console.log(JSON.stringify(data, null, "---"));
// {
// ---"name": "Alice",
// ---"scores": [
// ------95,
// ------87,
// ------92
// ---],
// ---"active": true
// }
The space value is clamped to a maximum of 10. Passing JSON.stringify(data, null, 20) produces the same result as JSON.stringify(data, null, 10). String values longer than 10 characters are truncated.
Passing null or 0 for space produces compact output with no whitespace. This is what you want for data transfer (API responses, storage). Pretty-printed output is for human consumption: logging, debugging, configuration files.
The toJSON() Method: Custom Serialization
If an object has a toJSON() method, JSON.stringify() calls it and serializes the return value instead of the object itself. This is how you control exactly what an object looks like when serialized.
class DateRange {
constructor(start, end) {
this.start = start;
this.end = end;
}
toJSON() {
return {
start: this.start.toISOString().split("T")[0],
end: this.end.toISOString().split("T")[0],
days: Math.ceil((this.end - this.start) / 86400000)
};
}
}
const range = new DateRange(
new Date("2026-02-01"),
new Date("2026-02-28")
);
console.log(JSON.stringify({ booking: range }, null, 2));
// {
// "booking": {
// "start": "2026-02-01",
// "end": "2026-02-28",
// "days": 27
// }
// }
The built-in Date object already has a toJSON() method that calls toISOString(). That is why dates serialize as ISO strings rather than the internal object structure:
const now = new Date();
console.log(JSON.stringify(now));
// '"2026-02-11T10:30:00.000Z"'
// Date.prototype.toJSON is defined as:
// Date.prototype.toJSON = function() { return this.toISOString(); }
The toJSON() method takes precedence over the replacer function. The replacer sees the value returned by toJSON(), not the original object. This ordering matters when you are combining both features.
TypeScript Interface for toJSON
interface Serializable {
toJSON(): unknown;
}
class ApiResponse implements Serializable {
constructor(
public data: unknown,
public statusCode: number,
private internalError?: Error
) {}
toJSON() {
return {
data: this.data,
status: this.statusCode,
// internalError is intentionally excluded from serialization
timestamp: new Date().toISOString()
};
}
}
Edge Cases: What JSON.stringify() Cannot Serialize
JSON can represent only a strict subset of JavaScript values. Many JavaScript types have no JSON representation. Understanding how JSON.stringify() handles these cases is essential for avoiding silent data loss.
undefined, Functions, and Symbols
const obj = {
name: "Alice",
callback: function() { return true; },
id: Symbol("user"),
nickname: undefined,
age: 30
};
console.log(JSON.stringify(obj));
// '{"name":"Alice","age":30}'
// callback, id, and nickname are silently dropped!
When these values appear as object properties, they are omitted entirely. When they appear as array elements, they are replaced with null:
const arr = [1, undefined, function() {}, Symbol("x"), 4];
console.log(JSON.stringify(arr));
// '[1,null,null,null,4]'
When these values are the root value, JSON.stringify() returns undefined (not the string "undefined"):
JSON.stringify(undefined); // undefined
JSON.stringify(function() {}); // undefined
JSON.stringify(Symbol("test")); // undefined
BigInt Throws an Error
const data = { id: 9007199254740993n };
try {
JSON.stringify(data);
} catch (e) {
console.error(e.message);
// "Do not know how to serialize a BigInt"
}
// Solution 1: Add toJSON to BigInt prototype (not recommended for libraries)
BigInt.prototype.toJSON = function() {
return this.toString();
};
// Solution 2: Use a replacer function (safer)
JSON.stringify(data, (key, value) =>
typeof value === "bigint" ? value.toString() : value
);
// '{"id":"9007199254740993"}'
BigInt is the only value type that causes JSON.stringify() to throw; every other unsupported type is silently dropped or converted to null. (Circular references also throw, but that is a structural problem rather than a type problem.)
Circular References Throw an Error
const a = { name: "parent" };
const b = { name: "child", parent: a };
a.child = b; // circular reference
try {
JSON.stringify(a);
} catch (e) {
console.error(e.message);
// "Converting circular structure to JSON"
}
// Solution: track seen objects
function safeStringify(obj, indent = 2) {
const seen = new WeakSet();
return JSON.stringify(obj, (key, value) => {
if (typeof value === "object" && value !== null) {
if (seen.has(value)) return "[Circular]";
seen.add(value);
}
return value;
}, indent);
}
console.log(safeStringify(a));
// {
// "name": "parent",
// "child": {
// "name": "child",
// "parent": "[Circular]"
// }
// }
Date Objects
const event = {
title: "Launch",
date: new Date("2026-03-15T09:00:00Z")
};
const json = JSON.stringify(event);
// '{"title":"Launch","date":"2026-03-15T09:00:00.000Z"}'
// When you parse it back, the date is a STRING, not a Date object
const parsed = JSON.parse(json);
console.log(typeof parsed.date); // "string"
console.log(parsed.date instanceof Date); // false
// You need a reviver to restore Date objects (covered in the JSON.parse section)
RegExp, Map, Set, WeakMap, WeakSet
// RegExp serializes as an empty object
JSON.stringify(/[a-z]+/gi); // '{}'
JSON.stringify({ pattern: /\d+/ }); // '{"pattern":{}}'
// Map and Set serialize as empty objects
JSON.stringify(new Map([["a", 1]])); // '{}'
JSON.stringify(new Set([1, 2, 3])); // '{}'
// To preserve these types, use a replacer:
function advancedReplacer(key, value) {
if (value instanceof Map) {
return { __type: "Map", data: [...value.entries()] };
}
if (value instanceof Set) {
return { __type: "Set", data: [...value.values()] };
}
if (value instanceof RegExp) {
return { __type: "RegExp", source: value.source, flags: value.flags };
}
return value;
}
NaN and Infinity
JSON.stringify(NaN); // 'null'
JSON.stringify(Infinity); // 'null'
JSON.stringify(-Infinity); // 'null'
const data = { score: NaN, limit: Infinity };
JSON.stringify(data);
// '{"score":null,"limit":null}'
This is a common source of silent data corruption. If a calculation produces NaN or Infinity, serializing and deserializing will turn it into null, not the original value.
Sparse Arrays
const sparse = [1, , , 4]; // holes at index 1 and 2
console.log(JSON.stringify(sparse));
// '[1,null,null,4]' — holes become null
Complete Edge Case Reference Table
| Input Value | As Object Property | As Array Element | As Root Value |
|---|---|---|---|
| `undefined` | Omitted | `null` | `undefined` (no output) |
| `function` | Omitted | `null` | `undefined` (no output) |
| `Symbol` | Omitted | `null` | `undefined` (no output) |
| `BigInt` | Throws TypeError | Throws TypeError | Throws TypeError |
| `NaN` / `Infinity` | `null` | `null` | `'null'` |
| `Date` | ISO string (via toJSON) | ISO string | ISO string |
| `RegExp` | `{}` (empty object) | `{}` | `'{}'` |
| `Map` / `Set` | `{}` (empty object) | `{}` | `'{}'` |
| Circular ref | Throws TypeError | Throws TypeError | Throws TypeError |
JSON.parse(): Converting Strings Back to Values
JSON.parse() takes a JSON-formatted string and converts it into a JavaScript value. The full signature is:
JSON.parse(text, reviver?)
Basic usage is straightforward:
const json = '{"name":"Alice","age":30,"roles":["admin","editor"]}';
const user = JSON.parse(json);
console.log(user.name); // "Alice"
console.log(user.age); // 30
console.log(user.roles[0]); // "admin"
JSON.parse() is strict about input format. It rejects several things that JavaScript allows:
// These all throw SyntaxError:
JSON.parse("{'name': 'Alice'}"); // Single quotes not allowed
JSON.parse('{name: "Alice"}'); // Unquoted keys not allowed
JSON.parse('{"name": "Alice",}'); // Trailing comma not allowed
JSON.parse('undefined'); // Not valid JSON
JSON.parse(''); // Empty string not valid
JSON.parse('// comment\n{}'); // Comments not allowed in JSON
// These work:
JSON.parse('null'); // null
JSON.parse('true'); // true
JSON.parse('42'); // 42
JSON.parse('"hello"'); // "hello"
The Reviver Function
The reviver is the counterpart to the replacer. It receives each key-value pair after parsing and can transform or filter values. The most common use case is restoring types that JSON cannot represent natively:
// Restore Date objects from ISO strings
const json = '{"title":"Launch","date":"2026-03-15T09:00:00.000Z","count":42}';
const event = JSON.parse(json, (key, value) => {
// Check if value looks like an ISO date string
if (typeof value === "string" && /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}/.test(value)) {
const date = new Date(value);
if (!isNaN(date.getTime())) return date;
}
return value;
});
console.log(event.date instanceof Date); // true
console.log(event.date.getFullYear()); // 2026
console.log(event.count); // 42 (unchanged)
The reviver walks the parsed value bottom-up. Nested objects are revived before their parents. The last call has an empty string key for the root value:
JSON.parse('{"a":{"b":1},"c":2}', (key, value) => {
console.log(key, value);
return value;
});
// Logs:
// "b" 1
// "a" {b: 1}
// "c" 2
// "" {a: {b: 1}, c: 2}
Like the replacer, returning undefined from the reviver removes the property:
// Strip internal fields that start with underscore
const data = '{"name":"Alice","_internal":"debug","_timestamp":12345,"role":"admin"}';
const cleaned = JSON.parse(data, (key, value) => {
if (key.startsWith("_")) return undefined; // remove
return value;
});
console.log(cleaned);
// { name: "Alice", role: "admin" }
Advanced Reviver: Restoring Custom Types
// Full round-trip with custom types
function advancedReviver(key, value) {
if (value && typeof value === "object" && value.__type) {
switch (value.__type) {
case "Map":
return new Map(value.data);
case "Set":
return new Set(value.data);
case "RegExp":
return new RegExp(value.source, value.flags);
case "BigInt":
return BigInt(value.value);
case "Date":
return new Date(value.iso);
}
}
return value;
}
// This pairs with the advancedReplacer from earlier
const original = {
users: new Map([["alice", "admin"]]),
tags: new Set(["js", "ts"]),
pattern: /^\d+$/g
};
const serialized = JSON.stringify(original, advancedReplacer);
const restored = JSON.parse(serialized, advancedReviver);
console.log(restored.users instanceof Map); // true
console.log(restored.tags instanceof Set); // true
console.log(restored.pattern instanceof RegExp); // true
Common Patterns: Deep Clone, Serialization, and Storage
Deep Cloning Objects
For years, the JSON round-trip was the standard way to deep-clone objects in JavaScript:
const original = {
name: "Alice",
address: {
city: "Portland",
state: "OR",
coordinates: { lat: 45.5152, lng: -122.6784 }
},
tags: ["developer", "speaker"]
};
// Classic deep clone
const clone = JSON.parse(JSON.stringify(original));
// Modifications to the clone do not affect the original
clone.address.city = "Seattle";
clone.tags.push("author");
console.log(original.address.city); // "Portland" (unchanged)
console.log(original.tags.length); // 2 (unchanged)
This works but has important limitations: it drops undefined, functions, and symbols; it converts Dates to strings; it throws on circular references; and it loses prototype chains. For simple data objects (API responses, configuration), it works well. For complex objects with methods or special types, use structuredClone() instead (covered below).
localStorage and sessionStorage
Web Storage APIs only store strings. JSON serialization is the standard approach for storing structured data:
// Save complex data to localStorage
function saveSettings(settings) {
try {
const json = JSON.stringify(settings);
localStorage.setItem("app-settings", json);
return true;
} catch (e) {
if (e instanceof DOMException && e.name === "QuotaExceededError") {
console.error("localStorage is full");
}
return false;
}
}
// Load with defaults and type restoration
function loadSettings(defaults) {
try {
const json = localStorage.getItem("app-settings");
if (!json) return defaults;
const stored = JSON.parse(json, (key, value) => {
// Restore Date objects
if (key === "lastSaved" || key === "createdAt") {
return new Date(value);
}
return value;
});
// Merge with defaults to handle new settings added after storage
return { ...defaults, ...stored };
} catch (e) {
console.error("Failed to load settings:", e);
return defaults;
}
}
// Usage
saveSettings({
theme: "dark",
fontSize: 14,
lastSaved: new Date(),
recentFiles: ["/src/app.ts", "/src/utils.ts"]
});
const settings = loadSettings({
theme: "light",
fontSize: 16,
lastSaved: null,
recentFiles: [],
showLineNumbers: true // new setting not in stored data
});
API Request and Response Handling
// TypeScript: type-safe API response handling
interface ApiResponse<T> {
data: T;
meta: {
page: number;
total: number;
timestamp: string;
};
}
interface User {
id: number;
name: string;
createdAt: Date;
}
async function fetchUsers(page: number): Promise<ApiResponse<User[]>> {
const response = await fetch(`/api/users?page=${page}`);
const text = await response.text();
// Parse with date restoration
return JSON.parse(text, (key, value) => {
if (key === "createdAt" || key === "timestamp") {
return new Date(value);
}
return value;
});
}
// Sending data with proper serialization
async function createUser(user: Omit<User, "id">): Promise<User> {
const response = await fetch("/api/users", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify(user) // Date auto-converts via toJSON()
});
return response.json();
}
Configuration File Handling in Node.js
import { readFileSync, writeFileSync } from "fs";
// Read and parse config with error handling
function loadConfig(path: string) {
try {
const text = readFileSync(path, "utf-8");
return JSON.parse(text);
} catch (e) {
const err = e as NodeJS.ErrnoException; // catch bindings are `unknown` in TS
if (err.code === "ENOENT") {
console.log(`Config not found at ${path}, using defaults`);
return {};
}
if (e instanceof SyntaxError) {
console.error(`Invalid JSON in ${path}: ${e.message}`);
process.exit(1);
}
throw e;
}
}
// Write config with pretty printing
function saveConfig(path: string, config: Record<string, unknown>) {
const json = JSON.stringify(config, null, 2) + "\n"; // trailing newline for POSIX
writeFileSync(path, json, "utf-8");
}
Comparing Objects for Equality
// Structural equality via string comparison (key-order-dependent)
function jsonEqual(a, b) {
return JSON.stringify(a) === JSON.stringify(b);
}
// Order-independent comparison for FLAT objects using a sorted allowlist.
// Caution: an array replacer filters at every nesting level, so nested
// keys missing from the top-level key list are silently dropped.
function flatEqual(a, b) {
const sortedStringify = (obj) =>
JSON.stringify(obj, Object.keys(obj).sort());
return sortedStringify(a) === sortedStringify(b);
}
// Note: this only works for JSON-serializable values. It fails for
// undefined, functions, and symbols, and has the same limitations as
// any JSON round-trip. For nested objects, use the deterministic
// stringify pattern shown later in this guide.
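Two quick examples of where string comparison misleads:

```javascript
// Values JSON cannot represent collapse into false equality
console.log(JSON.stringify({ a: undefined }) === JSON.stringify({})); // true
console.log(JSON.stringify({ x: NaN }) === JSON.stringify({ x: null })); // true
```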
Performance Tips for Large JSON
When working with large datasets (megabytes or gigabytes of JSON), serialization and parsing performance becomes a real concern.
Measuring Baseline Performance
// Generate a large dataset for benchmarking
const largeArray = Array.from({ length: 100_000 }, (_, i) => ({
id: i,
name: `User ${i}`,
email: `user${i}@example.com`,
scores: [Math.random() * 100, Math.random() * 100],
metadata: { created: new Date().toISOString(), active: i % 3 !== 0 }
}));
console.time("stringify");
const json = JSON.stringify(largeArray);
console.timeEnd("stringify");
// Typically 100-300ms for 100K objects in V8
console.time("parse");
const parsed = JSON.parse(json);
console.timeEnd("parse");
// Typically 100-200ms for the same data
console.log(`JSON size: ${(json.length / 1024 / 1024).toFixed(2)} MB`);
// Roughly 15-25 MB depending on data
Avoid Pretty-Printing for Data Transfer
const data = { /* large object */ };
// Compact: fast and small
const compact = JSON.stringify(data);
// Pretty: 20-40% larger, slightly slower to produce
const pretty = JSON.stringify(data, null, 2);
// Only use pretty-printing for human-readable output (logs, debug).
// API responses, file storage, and message queues should use compact JSON.
Replacer Function Overhead
// The replacer function is called for EVERY key-value pair.
// For 100K objects with 6 keys each, that's 600K+ function calls.
// Slow: complex replacer
const slow = JSON.stringify(data, (key, value) => {
if (typeof value === "string") return value.trim();
if (value instanceof Date) return value.toISOString();
return value;
});
// Faster: pre-process your data, then stringify without a replacer
const preprocessed = data.map(item => ({
...item,
name: item.name.trim(),
email: item.email.trim()
}));
const fast = JSON.stringify(preprocessed);
Streaming JSON for Very Large Data
// For datasets that don't fit in memory, use streaming parsers
// Node.js: stream-json for parsing
import { parser } from "stream-json";
import { streamArray } from "stream-json/streamers/StreamArray";
import { createReadStream } from "fs";
const pipeline = createReadStream("huge-file.json")
.pipe(parser())
.pipe(streamArray());
pipeline.on("data", ({ value }) => {
// Process each array element individually
// without loading the entire file into memory
processRecord(value);
});
// For writing large JSON, build it incrementally:
import { createWriteStream } from "fs";
const output = createWriteStream("output.json");
output.write("[\n");
let first = true;
for (const record of generateRecords()) {
if (!first) output.write(",\n");
output.write(JSON.stringify(record));
first = false;
}
output.write("\n]");
output.end();
Worker Threads for Non-Blocking Parsing
// In the browser, large JSON operations block the main thread.
// Use a Web Worker to parse in the background:
// worker.js
self.onmessage = (event) => {
try {
const result = JSON.parse(event.data);
self.postMessage({ success: true, data: result });
} catch (e) {
self.postMessage({ success: false, error: e.message });
}
};
// main.js
function parseAsync(jsonString) {
return new Promise((resolve, reject) => {
const worker = new Worker("worker.js");
worker.onmessage = (event) => {
worker.terminate();
if (event.data.success) resolve(event.data.data);
else reject(new Error(event.data.error));
};
worker.postMessage(jsonString);
});
}
// Usage
const data = await parseAsync(hugeJsonString);
Security Considerations
JSON serialization and parsing introduce several security risks. Understanding these is critical for any application that handles user-provided data.
Prototype Pollution via JSON.parse()
// Attacker sends this JSON payload:
const malicious = '{"__proto__": {"isAdmin": true}}';
// JSON.parse creates "__proto__" as an ordinary own property here:
const userInput = JSON.parse(malicious);
// Object.assign alone only re-points the target's own prototype. The
// classic pollution vector is a naive deep merge that recurses INTO
// "__proto__" — which resolves to Object.prototype — and writes onto it:
function naiveMerge(target, source) {
for (const key of Object.keys(source)) {
if (typeof source[key] === "object" && source[key] !== null) {
naiveMerge(target[key] ?? (target[key] = {}), source[key]);
} else {
target[key] = source[key];
}
}
}
naiveMerge({}, userInput);
// Now EVERY object inherits isAdmin: true
const user = {};
console.log(user.isAdmin); // true! Prototype pollution!
// DEFENSE 1: Use Object.create(null) for parsed data targets
const safeConfig = Object.create(null);
Object.assign(safeConfig, JSON.parse(malicious));
// safeConfig has no prototype, so __proto__ is just a regular property
// DEFENSE 2: Filter dangerous keys in a reviver
function safeReviver(key, value) {
if (key === "__proto__" || key === "constructor" || key === "prototype") {
return undefined; // strip dangerous keys
}
return value;
}
const safe = JSON.parse(malicious, safeReviver);
// { } — __proto__ key was stripped
// DEFENSE 3: Use a validation library (zod, joi, ajv)
import { z } from "zod";
const UserSchema = z.object({
name: z.string().max(100),
email: z.string().email(),
role: z.enum(["viewer", "editor", "admin"])
});
const validated = UserSchema.parse(userInput);
// Throws if the parsed data doesn't match the schema
XSS via JSON Embedded in HTML
// DANGEROUS: Embedding user-controlled JSON in a script tag
const userData = { name: "Alice</script><script>alert('XSS')</script>" };
// This creates an XSS vulnerability:
// <script>const data = {"name":"Alice</script><script>alert('XSS')</script>"}</script>
// DEFENSE: Escape special characters in JSON embedded in HTML
function safeJsonForHtml(data) {
return JSON.stringify(data)
.replace(/</g, "\\u003c")
.replace(/>/g, "\\u003e")
.replace(/&/g, "\\u0026")
.replace(/'/g, "\\u0027");
}
// Safe output:
// <script>const data = {"name":"Alice\\u003c/script\\u003e\\u003cscript\\u003ealert('XSS')\\u003c/script\\u003e"}</script>
Denial of Service via Large or Deeply Nested JSON
// Attacker sends deeply nested JSON to crash the parser:
// {"a":{"a":{"a":{"a":{ ... 10000 levels deep ... }}}}}
// DEFENSE: Limit input size before parsing
function safeParse(input, maxSize = 1_000_000) {
if (typeof input !== "string") {
throw new TypeError("Input must be a string");
}
if (input.length > maxSize) {
throw new RangeError(`JSON input exceeds ${maxSize} characters`);
}
return JSON.parse(input);
}
// In Express.js, use the built-in body parser limit:
app.use(express.json({ limit: "1mb" }));
Number Precision Loss
// JSON numbers that exceed Number.MAX_SAFE_INTEGER lose precision
const json = '{"id": 9007199254740993}';
const parsed = JSON.parse(json);
console.log(parsed.id); // 9007199254740992 (WRONG! Off by 1)
// DEFENSE 1: Use string IDs in your API
// { "id": "9007199254740993" }
// DEFENSE 2: Parse large numbers as BigInt with a reviver
const safeParsed = JSON.parse(json, (key, value, context) => {
// context.source (the raw source text) comes from the stage-3
// "JSON.parse source text access" proposal, shipped in recent V8 engines
if (key === "id" && typeof value === "number") {
// Use the raw source string to construct a BigInt
return BigInt(context.source);
}
return value;
});
// DEFENSE 3: Use a library like json-bigint (with native BigInt enabled)
import JSONbig from "json-bigint";
const JSONbigNative = JSONbig({ useNativeBigInt: true });
const result = JSONbigNative.parse(json);
// result.id is a BigInt: 9007199254740993n
Alternatives to JSON.stringify() and JSON.parse()
structuredClone() for Deep Cloning
Available in all modern browsers and Node.js 17+, structuredClone() is the modern replacement for the JSON round-trip clone pattern:
const original = {
name: "Alice",
date: new Date("2026-03-15"),
data: new Map([["key", "value"]]),
buffer: new ArrayBuffer(8),
pattern: /test/gi,
nested: { deep: { value: 42 } }
};
// structuredClone preserves these types!
const clone = structuredClone(original);
console.log(clone.date instanceof Date); // true
console.log(clone.data instanceof Map); // true
console.log(clone.pattern instanceof RegExp); // true
// It also handles circular references without throwing:
const circular = { name: "self" };
circular.self = circular;
const cloned = structuredClone(circular); // Works!
console.log(cloned.self === cloned); // true
Comparison of cloning approaches:
| Feature | JSON round-trip | structuredClone() |
|---|---|---|
| Date objects | Converted to strings | Preserved |
| Map / Set | Lost (empty object) | Preserved |
| RegExp | Lost (empty object) | Preserved |
| ArrayBuffer / TypedArray | Lost | Preserved |
| Circular references | Throws error | Handled |
| undefined values | Dropped / null | Preserved |
| Functions | Dropped | Throws error |
| Class instances | Plain objects | Plain objects |
| Produces a string | Yes (intermediate step) | No (direct clone) |
Use structuredClone() when you need a deep copy. Use JSON.stringify() / JSON.parse() when you need a JSON string (for storage, network transfer, or logging).
Custom Serializers: superjson, devalue, and Others
// superjson: preserves types across serialization
import superjson from "superjson";
const data = {
date: new Date(),
set: new Set([1, 2, 3]),
map: new Map([["a", 1]]),
bigint: 9007199254740993n,
regexp: /test/gi,
undef: undefined
};
const { json, meta } = superjson.serialize(data);
// json contains the data, meta contains type information
// Both are JSON-serializable
const restored = superjson.deserialize({ json, meta });
// All types perfectly restored!
console.log(restored.date instanceof Date); // true
console.log(restored.set instanceof Set); // true
console.log(typeof restored.bigint); // "bigint"
// devalue: Svelte's serializer, handles cycles and more
import { stringify, parse } from "devalue";
const circular = { name: "self" };
circular.ref = circular;
const serialized = stringify(circular); // Works!
const restored = parse(serialized);
console.log(restored.ref === restored); // true
MessagePack and CBOR: Binary Alternatives
// msgpack: binary-serialized JSON-like format, 20-50% smaller
import { encode, decode } from "@msgpack/msgpack";
const data = { name: "Alice", scores: [95, 87, 92], active: true };
const packed = encode(data); // Uint8Array, compact binary
const unpacked = decode(packed); // original data restored
console.log(JSON.stringify(data).length); // ~50 bytes
console.log(packed.length); // ~31 bytes (roughly 40% smaller)
Practical Patterns and Recipes
Safe JSON.parse() Wrapper
// Production-grade safe parse with TypeScript
function tryParse<T>(json: string, fallback: T): T;
function tryParse<T>(json: string): T | undefined;
function tryParse<T>(json: string, fallback?: T): T | undefined {
try {
return JSON.parse(json) as T;
} catch {
return fallback;
}
}
// Usage
const config = tryParse<Config>(rawJson, defaultConfig);
const data = tryParse<User[]>(response);
if (data === undefined) {
console.error("Failed to parse response");
}
Deterministic JSON (Sorted Keys)
// Since ES2015, JSON.stringify follows property insertion order, so two
// structurally equal objects built in different insertion order serialize
// to different strings. For caching, hashing, or diffing, you need
// deterministic output:
function deterministicStringify(obj, space) {
return JSON.stringify(obj, (key, value) => {
if (value && typeof value === "object" && !Array.isArray(value)) {
return Object.keys(value).sort().reduce((sorted, k) => {
sorted[k] = value[k];
return sorted;
}, {});
}
return value;
}, space);
}
const a = { z: 1, a: 2, m: 3 };
const b = { m: 3, z: 1, a: 2 };
// Standard stringify may differ:
JSON.stringify(a); // '{"z":1,"a":2,"m":3}'
JSON.stringify(b); // '{"m":3,"z":1,"a":2}'
// Deterministic stringify always matches:
deterministicStringify(a); // '{"a":2,"m":3,"z":1}'
deterministicStringify(b); // '{"a":2,"m":3,"z":1}'
Redacting Sensitive Fields
// Generic redactor for logging
// All entries lowercase, because the lookup below uses key.toLowerCase()
const SENSITIVE_KEYS = new Set([
"password", "secret", "token", "apikey", "api_key",
"authorization", "credit_card", "creditcard", "ssn"
]);
function redactForLogging(data) {
return JSON.stringify(data, (key, value) => {
if (SENSITIVE_KEYS.has(key.toLowerCase())) {
return "[REDACTED]";
}
// Redact strings that look like JWTs
if (typeof value === "string" && /^eyJ[A-Za-z0-9_-]+\.eyJ/.test(value)) {
return "[REDACTED JWT]";
}
return value;
}, 2);
}
console.log(redactForLogging({
user: "alice",
password: "s3cret",
token: "eyJhbGciOiJIUzI1NiJ9.eyJzdWIiOiIxMjM0NTY3ODkwIn0.abc",
data: { nested: { apiKey: "sk-12345" } }
}));
// {
// "user": "alice",
// "password": "[REDACTED]",
// "token": "[REDACTED JWT]",
// "data": {
// "nested": {
// "apiKey": "[REDACTED]"
// }
// }
// }
Truncating Large Objects for Logging
// Limit string lengths and array sizes for log output
function truncatedStringify(obj, maxStringLength = 200, maxArrayLength = 10) {
return JSON.stringify(obj, (key, value) => {
if (typeof value === "string" && value.length > maxStringLength) {
return value.slice(0, maxStringLength) + `... (${value.length} chars)`;
}
if (Array.isArray(value) && value.length > maxArrayLength) {
return [
...value.slice(0, maxArrayLength),
`... and ${value.length - maxArrayLength} more items`
];
}
return value;
}, 2);
}
Creating a Simple Diff of Two Objects
// Simple diff to see what changed between two JSON-serializable objects
function jsonDiff(before, after) {
const changes = [];
const beforeJson = JSON.parse(JSON.stringify(before));
const afterJson = JSON.parse(JSON.stringify(after));
function walk(path, a, b) {
if (JSON.stringify(a) === JSON.stringify(b)) return;
if (typeof a !== typeof b || Array.isArray(a) !== Array.isArray(b)) {
changes.push({ path, before: a, after: b });
return;
}
// typeof null is "object", so guard both sides before calling Object.keys
if (typeof a === "object" && a !== null && b !== null) {
const allKeys = new Set([...Object.keys(a), ...Object.keys(b)]);
for (const key of allKeys) {
walk(`${path}.${key}`, a[key], b[key]);
}
} else {
changes.push({ path, before: a, after: b });
}
}
walk("$", beforeJson, afterJson);
return changes;
}
TypeScript Type Safety with JSON
TypeScript adds static typing, but JSON.parse() returns any: the compiler cannot check the shape of data that only exists at runtime. Here are patterns to bridge the gap safely.
Type Guards and Validation
interface User {
id: number;
name: string;
email: string;
role: "admin" | "editor" | "viewer";
}
// Type guard for runtime validation
function isUser(value: unknown): value is User {
if (typeof value !== "object" || value === null) return false;
const obj = value as Record<string, unknown>;
return (
typeof obj.id === "number" &&
typeof obj.name === "string" &&
typeof obj.email === "string" &&
["admin", "editor", "viewer"].includes(obj.role as string)
);
}
// Safe parsing with type narrowing
function parseUser(json: string): User {
const parsed: unknown = JSON.parse(json);
if (!isUser(parsed)) {
throw new Error("Invalid user data");
}
return parsed; // TypeScript knows this is User
}
Using Zod for Schema Validation
import { z } from "zod";
const UserSchema = z.object({
id: z.number().int().positive(),
name: z.string().min(1).max(100),
email: z.string().email(),
role: z.enum(["admin", "editor", "viewer"]),
createdAt: z.string().datetime().transform(s => new Date(s)),
tags: z.array(z.string()).default([])
});
type User = z.infer<typeof UserSchema>;
// Parse and validate in one step
function parseUser(json: string): User {
return UserSchema.parse(JSON.parse(json));
}
// Safe version that returns a Result type
function tryParseUser(json: string) {
try {
const raw = JSON.parse(json);
return UserSchema.safeParse(raw);
} catch {
return { success: false as const, error: new Error("Invalid JSON") };
}
}
const result = tryParseUser(inputJson);
if (result.success) {
console.log(result.data.createdAt instanceof Date); // true
} else {
console.error("Validation failed:", result.error);
}
JSON.stringify() and JSON.parse() in Different Runtimes
Node.js Specific Considerations
// Node.js can import JSON files directly with import attributes
// (the `with { type: "json" }` syntax, stable since Node 20.10)
import config from "./config.json" with { type: "json" };
// Or with require in CommonJS
const config = require("./config.json");
// For large files, use streaming (covered in the performance section)
// Node.js also provides util.inspect() for debugging output
// that handles circular references, functions, and symbols:
import { inspect } from "util";
console.log(inspect(complexObject, { depth: null, colors: true }));
Deno and Bun Compatibility
// JSON.stringify and JSON.parse work identically across all JS runtimes.
// The methods are part of the ECMAScript specification, not a runtime API.
// Deno: reading JSON files
const config = JSON.parse(await Deno.readTextFile("./config.json"));
// Bun: fast native JSON parsing
const config = JSON.parse(await Bun.file("./config.json").text());
// Bun runs on JavaScriptCore, which ships its own heavily optimized JSON.parse
Frequently Asked Questions
What is the difference between JSON.stringify() and toString()?
JSON.stringify() converts a value to a valid JSON string, with proper quoting, escaping, and formatting. The toString() method on objects produces [object Object], which is not useful for serialization. Always use JSON.stringify() when you need a string representation that can be parsed back into a value.
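A quick contrast makes the difference concrete:

```javascript
const user = { name: "Alice", age: 30 };

// The default toString() is useless for serialization:
console.log(String(user)); // "[object Object]"

// JSON.stringify() produces a representation that round-trips:
const json = JSON.stringify(user); // '{"name":"Alice","age":30}'
const restored = JSON.parse(json);
console.log(restored.age); // 30
```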
Why does JSON.stringify() return undefined for some values?
JSON.stringify() returns undefined (not the string "undefined") when called with undefined, a function, or a Symbol as the root value. These types have no JSON representation. When they appear as object properties, they are silently omitted instead.
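A short demonstration of both behaviors:

```javascript
// As the root value: no JSON representation, so the call returns undefined.
console.log(JSON.stringify(undefined));   // undefined
console.log(JSON.stringify(() => {}));    // undefined
console.log(JSON.stringify(Symbol("x"))); // undefined

// As object properties: silently omitted from the output.
console.log(JSON.stringify({ a: undefined, b: () => {}, c: 1 })); // '{"c":1}'
```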
Is JSON.parse() safe to use with untrusted input?
JSON.parse() itself does not execute code, unlike eval(). It is safe from code injection. However, you must still validate the parsed data's structure and watch for prototype pollution (using __proto__ keys). Always validate parsed input against a schema before using it in your application.
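One defensive pattern is to strip dangerous keys in a reviver before the parsed data reaches any merge logic. This is a sketch, not a substitute for full schema validation, and note it will also drop legitimate properties that happen to be named "constructor":

```javascript
const DANGEROUS_KEYS = new Set(["__proto__", "constructor", "prototype"]);

function safeParse(json) {
  // Returning undefined from a reviver deletes the property entirely.
  return JSON.parse(json, (key, value) =>
    DANGEROUS_KEYS.has(key) ? undefined : value
  );
}

const parsed = safeParse('{"__proto__": {"isAdmin": true}, "name": "eve"}');
console.log(parsed.name);                        // "eve"
console.log(Object.hasOwn(parsed, "__proto__")); // false — key stripped
console.log({}.isAdmin);                         // undefined — nothing polluted
```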
How do I handle dates in JSON?
JSON has no date type. When you stringify a Date, it becomes an ISO string via toJSON(). When you parse it back, it is just a string. Use a reviver function to convert date strings back to Date objects. A common pattern is to check for ISO 8601 format strings and convert them automatically.
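A minimal reviver for this pattern (the regex is a heuristic: it assumes any matching string is meant to be a date):

```javascript
// Matches full ISO 8601 timestamps like "2024-01-15T10:30:00.000Z"
const ISO_DATE = /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(\.\d+)?(Z|[+-]\d{2}:\d{2})$/;

const json = JSON.stringify({ createdAt: new Date("2024-01-15T10:30:00.000Z") });

const restored = JSON.parse(json, (key, value) =>
  typeof value === "string" && ISO_DATE.test(value) ? new Date(value) : value
);

console.log(restored.createdAt instanceof Date); // true
console.log(restored.createdAt.toISOString());   // "2024-01-15T10:30:00.000Z"
```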
What is the maximum size of a JSON string?
The JSON specification has no size limit. In practice, you are limited by available memory and the JavaScript engine's maximum string length. In V8 (Chrome, Node.js) that limit is roughly 2^29 characters, about 536 million. For larger datasets, use streaming JSON parsers.
Should I use JSON.stringify() for deep cloning?
For simple data objects (no Dates, Maps, Sets, or circular references), the JSON round-trip is still a quick, dependency-free deep clone. For anything more complex, use structuredClone(), which handles all these cases correctly and is available in all modern environments.
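The difference is easy to see side by side (structuredClone is available in Node 17+ and all modern browsers):

```javascript
const original = {
  when: new Date("2024-01-15T10:30:00Z"),
  tags: new Set(["a", "b"])
};

// The JSON round-trip silently degrades non-JSON types:
const jsonClone = JSON.parse(JSON.stringify(original));
console.log(jsonClone.when instanceof Date); // false — now an ISO string
console.log(jsonClone.tags);                 // {} — the Set became an empty object

// structuredClone preserves them:
const realClone = structuredClone(original);
console.log(realClone.when instanceof Date); // true
console.log(realClone.tags.has("a"));        // true
```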
Quick Reference: JSON.stringify() and JSON.parse()
// ============================================
// JSON.stringify(value, replacer?, space?)
// ============================================
// Basic serialization
JSON.stringify({ a: 1, b: "hello" }); // '{"a":1,"b":"hello"}'
// Pretty print with 2 spaces
JSON.stringify(data, null, 2);
// Filter properties with array replacer
JSON.stringify(user, ["id", "name", "email"]);
// Transform with function replacer
JSON.stringify(data, (key, val) => key === "password" ? undefined : val);
// ============================================
// JSON.parse(text, reviver?)
// ============================================
// Basic parsing
JSON.parse('{"a":1,"b":"hello"}'); // { a: 1, b: "hello" }
// Restore dates with reviver
JSON.parse(json, (key, val) => {
if (typeof val === "string" && /^\d{4}-\d{2}-\d{2}T/.test(val)) return new Date(val);
return val;
});
// Filter properties with reviver
JSON.parse(json, (key, val) => key.startsWith("_") ? undefined : val);
// ============================================
// Common patterns
// ============================================
// Deep clone (simple data only)
const clone = JSON.parse(JSON.stringify(original));
// Deep clone (modern, handles more types)
const clone = structuredClone(original);
// localStorage read/write
localStorage.setItem("key", JSON.stringify(data));
const data = JSON.parse(localStorage.getItem("key"));
// Safe parse with fallback
const data = (() => { try { return JSON.parse(str); } catch { return null; } })();
// Circular reference safe stringify
const seen = new WeakSet();
JSON.stringify(obj, (k, v) => {
if (typeof v === "object" && v !== null) {
if (seen.has(v)) return "[Circular]";
seen.add(v);
}
return v;
});
Conclusion
JSON.stringify() and JSON.parse() are deceptively deep methods. At the surface, they convert between objects and strings. Below that, they offer replacer and reviver functions for fine-grained control, handle a complex set of edge cases around unsupported types, and carry real security implications when used with untrusted data.
The key takeaways from this guide:
- Always handle edge cases: undefined, functions, symbols, and BigInt are not JSON-serializable and require explicit handling.
- Use the replacer for output control: filtering sensitive fields, transforming values, and serializing custom types.
- Use the reviver for input restoration: converting ISO strings to Dates, reconstructing Maps and Sets, and stripping dangerous keys.
- Prefer structuredClone() over the JSON round-trip for deep cloning in modern code.
- Validate all untrusted JSON input against a schema (zod, joi, ajv) to prevent prototype pollution and ensure data integrity.
- For large datasets, consider streaming parsers, worker threads, and binary alternatives like MessagePack.
JSON remains the lingua franca of data interchange on the web. A thorough understanding of its JavaScript API is not optional knowledge for web developers; it is foundational.