How Shallow Copying Creates Reference Traps in Nested Data
Shallow copying with spread syntax only copies the first level of object properties, leaving nested objects as shared references. Understanding this limitation prevents subtle bugs in complex data structures and helps developers choose between shallow copying, deep copying, or immutable libraries like Immer.
TL;DR
- Use { ...obj } for flat objects, but beware of nested references (demo below)
- Shallow copying shares references to nested objects and arrays
- Perfect for simple objects, dangerous for complex nested data
- Consider deep copy libraries for multi-level object structures
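A minimal demonstration of the trap (the object values are illustrative):
// Nested objects survive the spread only as shared references
const original = { name: 'Ada', prefs: { theme: 'dark' } }
const shallowCopy = { ...original }
shallowCopy.prefs.theme = 'light' // mutates the shared nested object
console.log(original.prefs.theme) // 'light' - the original changed too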
The Nested Object Reference Trap
You're building a user management system where admins can modify user profiles. The current shallow copying approach works fine for simple updates, but creates dangerous shared references when dealing with nested objects like addresses or preferences.
// The shallow copy trap - shared nested references
const originalUser = {
id: 123,
name: 'John Doe',
address: { street: '123 Main St', city: 'NYC', zip: '10001' },
preferences: { theme: 'dark', notifications: true },
}
function updateUserShallow(user, changes) {
// DANGER: Shallow copy shares nested object references!
const updated = { ...user, ...changes }
console.log('Shallow copy created')
console.log('Same address reference?', updated.address === user.address)
return updated
}
// Test the dangerous approach
const shallowUpdate = updateUserShallow(originalUser, { name: 'Jane Doe' })
Proper nested copying requires manual spread at each level or deep copy utilities:
// The safe nested copying solution
const user = {
id: 123,
name: 'John',
address: { street: '123 Main St', city: 'NYC' },
preferences: { theme: 'dark', notifications: true },
}
function updateUserWithSafeCopy(user, changes) {
  // Copy nested objects explicitly so the result shares no references
  // (note: top-level changes win, but the nested copies come from user)
  return {
    ...user,
    ...changes,
    address: { ...user.address },
    preferences: { ...user.preferences },
  }
}
console.log('Safe copy:', updateUserWithSafeCopy(user, { name: 'Jane' }))
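Note that updateUserWithSafeCopy protects only the levels it spreads; to change a nested field immutably, spread every level on the path you modify:
// Immutable nested update: spread each level you touch
const moved = {
  ...user,
  address: { ...user.address, city: 'Boston' },
}
console.log(user.address.city) // 'NYC' - original preserved
console.log(moved.address.city) // 'Boston'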
Best Practices
Use shallow copying when:
- ✅ Objects have only primitive properties (strings, numbers, booleans)
- ✅ You need fast copying and nested objects won't be modified
- ✅ Working with flat configuration objects or simple state
- ✅ Performance is critical and you understand the reference implications
Avoid shallow copying when:
- 🚩 Objects contain nested objects, arrays, or functions
- 🚩 Multiple components will modify the same nested data
- 🚩 You need true isolation between original and copy
- 🚩 Data structures have unknown depth or complexity
Avoid deep copy libraries when:
- 🚩 The use case is simple and the data is flat
- 🚩 Performance is critical and a shallow copy suffices
- 🚩 The team is unfamiliar with the library's patterns
- 🚩 Legacy browser support is required (even the built-in structuredClone shown below needs modern runtimes)
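When you do need true isolation without a library, the built-in structuredClone (available in Node 17+ and all modern browsers) handles most nested data, though it throws on functions - a minimal sketch:
// structuredClone: built-in deep copy for nested data (no functions)
const booking = {
  guest: { id: 7, name: 'Ada' },
  dates: { checkIn: new Date('2025-06-01'), nights: 3 },
}
const clone = structuredClone(booking)
clone.guest.name = 'Grace'
console.log(booking.guest.name) // 'Ada' - original untouched
console.log(clone.dates.checkIn instanceof Date) // true - Dates survive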
System Design Trade-offs
| Aspect | Shallow Copy | Deep Copy | JSON Parse/Stringify |
| --- | --- | --- | --- |
| Performance | Fastest - copies references | Slower - recursive copying | Slow - serialization overhead |
| Memory Usage | Efficient - shared references | Higher - duplicates everything | Highest - string conversion |
| Nested Safety | Dangerous - shared references | Safe - isolated copies | Safe - new objects created |
| Function Support | Preserves functions | Depends on library | Lost - not serializable (demo below) |
| Browser Support | ES2018+ required | Depends on library | Universal support |
| Use Case | Flat objects, performance | Complex nested data | Simple data without functions |
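The JSON column's caveats are easy to demonstrate: functions and undefined values vanish in the round trip, and Dates degrade to strings (the sample object is illustrative):
// JSON round-trip: a deep copy that silently loses data
const record = {
  created: new Date('2025-01-01'),
  notify: () => console.log('hi'),
  note: undefined,
  count: 2,
}
const copy = JSON.parse(JSON.stringify(record))
console.log(typeof copy.created) // 'string' - Date became an ISO string
console.log('notify' in copy) // false - function dropped
console.log('note' in copy) // false - undefined property dropped
console.log(copy.count) // 2 - plain data survives intact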
More Code Examples
❌ Shallow copy reference nightmare
// Shallow copy approach - dangerous shared references in nested data
function processComplexDataShallow(inputData) {
  if (!inputData) {
    throw new Error('Input data required')
  }
  // DANGER: Shallow copy shares all nested references!
  const processedData = { ...inputData }
  // Processing seems safe but creates hidden mutations
  processedData.results.processed = true
  processedData.results.totals.count = processedData.datasets.length
  // Modify nested arrays - affects the original!
  processedData.datasets.forEach((dataset) => {
    const sum = dataset.values.reduce((a, b) => a + b, 0)
    processedData.results.totals.sum += sum
    dataset.config.normalized = true // MUTATES ORIGINAL!
  })
  // Add processing timestamp - written onto the shared metadata object
  processedData.metadata.processed = new Date().toISOString()
  console.log('Shallow copy processing completed')
  console.log('Original data mutated:', inputData.datasets[0].config.normalized)
  console.log('Same metadata reference:', processedData.metadata === inputData.metadata)
  console.log('Same datasets reference:', processedData.datasets === inputData.datasets)
  // The "copy" shares references - the caller's data is corrupted!
  return processedData
}
// Test the dangerous shallow copy
const originalData = {
metadata: { created: '2025-01-01', author: { id: 123, name: 'John' } },
datasets: [{ id: 1, values: [1, 2, 3], config: { normalized: false } }],
results: { totals: { sum: 0, count: 0 }, processed: false },
}
const testResult = processComplexDataShallow(originalData)
console.log('Processing complete - but original data was corrupted!')
console.log('Original data normalized flag:', originalData.datasets[0].config.normalized)
console.log('Shallow copy reference sharing caused mutation bugs')
✅ Deep copy isolation safety
// Deep copy approach - safe isolation for nested data
// Handles plain objects, arrays, and Dates; does not handle Map, Set,
// RegExp, functions, or circular references
function recursiveDeepClone(obj) {
  // Primitives and null pass through unchanged
  if (obj === null || typeof obj !== 'object') return obj
  // Copy Dates explicitly - spreading one would yield an empty object
  if (obj instanceof Date) return new Date(obj.getTime())
  if (Array.isArray(obj)) return obj.map(recursiveDeepClone)
  // Recursively clone every own enumerable property
  const copy = {}
  Object.keys(obj).forEach((key) => (copy[key] = recursiveDeepClone(obj[key])))
  return copy
}
function processComplexDataSafely(inputData) {
if (!inputData) throw new Error('Input required')
// Safe: Deep copy creates isolated data
const processedData = recursiveDeepClone(inputData)
// Safe processing - no mutations to original
processedData.results.processed = true
processedData.results.totals.count = processedData.datasets.length
processedData.datasets.forEach((dataset) => {
const sum = dataset.values.reduce((a, b) => a + b, 0)
processedData.results.totals.sum += sum
dataset.config.normalized = true // Only affects copy!
})
processedData.metadata.processed = new Date().toISOString()
processedData.metadata.version = (processedData.metadata.version || 0) + 1
console.log('Deep copy safe:', inputData.datasets[0].config.normalized)
console.log('Different refs:', processedData.metadata !== inputData.metadata)
return processedData
}
// Alternative: JSON parse/stringify for simple deep copy
function jsonBasedDeepClone(obj) {
return JSON.parse(JSON.stringify(obj)) // Loses functions and undefined; Dates become strings
}
// Test the safe approach
const testData = {
metadata: { created: '2025-01-01', author: { id: 123, name: 'John' } },
datasets: [{ id: 1, values: [1, 2, 3], config: { normalized: false } }],
results: { totals: { sum: 0, count: 0 }, processed: false },
}
const safeResult = processComplexDataSafely(testData)
console.log('Safe complete - original preserved!')
console.log('Original flag:', testData.datasets[0].config.normalized)
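For ongoing immutable updates, Immer (mentioned in the intro) avoids both manual spreading and full deep copies - a minimal sketch, assuming the immer package is installed:
// Immer: mutate a draft, get back a new immutable result
import { produce } from 'immer'

const base = {
  metadata: { author: { id: 123, name: 'John' } },
  results: { totals: { sum: 0, count: 0 }, processed: false },
}
const next = produce(base, (draft) => {
  draft.results.processed = true
  draft.results.totals.count = 1
})
console.log(base.results.processed) // false - original untouched
console.log(next.metadata === base.metadata) // true - unchanged branches are shared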
Technical Trivia
The Airbnb Booking Corruption Incident (2019): Airbnb's booking system experienced widespread data corruption when shallow copying was used for reservation modifications. Guest updates to booking details accidentally modified shared nested objects, causing other guests' reservations to show incorrect dates, prices, and amenities - affecting over 50,000 bookings before detection.
Why shallow copying failed catastrophically: The booking objects contained nested references for property details, pricing breakdowns, and guest preferences. When one booking was "copied" and modified, the shared references meant changes propagated to all bookings referencing the same nested objects, creating a cascade of data corruption across the entire reservation system.
Modern solutions prevent reference disasters: Libraries like Immer provide structural sharing with copy-on-write semantics, TypeScript can enforce immutability constraints, and deep equality checks in tests catch shared reference bugs early. Teams now use { ...obj } only for flat objects and switch to proper deep copying for complex nested data structures.
Master Safe Copying: Choose the Right Strategy for Your Data
Use shallow copying only for flat objects with primitive values where performance matters most. For nested data, choose deep copy libraries like Lodash's cloneDeep, Immer for immutable updates, or JSON.parse/stringify for simple structures. The { ...obj } pattern is excellent for React props and simple state, but dangerous for complex data - understand your data structure depth before choosing a copying strategy.