
I keep running into situations where I deeply nest reduce functions to drill down into an object. It's hard to pull the logic out, because at the bottom I need access to the various keys traversed along the way. Essentially, I'm looking for a better way to achieve the following:

import { curry } from 'lodash/fp'
import { fromJS } from 'immutable'

const reduce = curry((fn, acc, it) => it.reduce(fn, acc))

describe('reduceNested', () => {
  const input = fromJS({
    a1: {
      b1: {
        c1: {
          d1: {
            e1: 'one',
            e2: 'two',
            e3: 'three'
          },
          d2: {
            e1: 'one',
            e2: 'two',
            e3: 'three'
          }
        },
        c2: {
          d1: {
            e1: 'one',
            e2: 'two'
          }
        }
      }
    },
    a2: {
      b1: {
        c1: {
          d1: {
            e1: 'one'
          },
          d2: {
            e1: 'one'
          }
        }
      },
      b2: {
        c1: {
          d1: {
            e1: 'one'
          },
          d2: {
            e1: 'one'
          }
        }
      }
    },
    a3: {
      b1: {
        c1: {}
      }
    }
  })

  const expected = fromJS({
    one: [
      'a1.b1.c1.d1.e1',
      'a1.b1.c1.d2.e1',
      'a1.b1.c2.d1.e1',
      'a2.b1.c1.d1.e1',
      'a2.b1.c1.d2.e1',
      'a2.b2.c1.d1.e1',
      'a2.b2.c1.d2.e1'
    ],
    two: ['a1.b1.c1.d1.e2', 'a1.b1.c1.d2.e2', 'a1.b1.c2.d1.e2'],
    three: ['a1.b1.c1.d1.e3', 'a1.b1.c1.d2.e3']
  })

  const init = fromJS({ one: [], two: [], three: [] })

  test('madness', () => {
    const result = reduce(
      (acc2, val, key) =>
        reduce(
          (acc3, val2, key2) =>
            reduce(
              (acc4, val3, key3) =>
                reduce(
                  (acc5, val4, key4) =>
                    reduce(
                      (acc6, val5, key5) =>
                        acc6.update(val5, i =>
                          i.push(`${key}.${key2}.${key3}.${key4}.${key5}`)
                        ),
                      acc5,
                      val4
                    ),
                  acc4,
                  val3
                ),
              acc3,
              val2
            ),
          acc2,
          val
        ),
      init,
      input
    )

    expect(result).toEqual(expected)
  })

  test('better', () => {
    const result = reduceNested(
      (acc, curr, a, b, c, d, e) =>
        acc.update(curr, i => i.push(`${a}.${b}.${c}.${d}.${e}`)),
      init,
      input
    )

    expect(result).toEqual(expected)
  })
})

I would like to write a function reduceNested that achieves the same result but without all of the nested reduce functions. I don't see anything in lodash/fp or similar libraries that addresses this, so my thought was to create a new function reduceNested that adds a variable to the callback for each key in the tree. I've tried implementing the actual logic but am unfortunately stuck for the time being. I know reduceNested will need to use fn.length to determine how far down into the source to drill, but beyond that I'm stuck.

const reduceNested = curry((fn, acc, iter) => {
  // TODO: recurse into iter, using (fn.length - 2) to know how many key levels to drill
})
2 Comments
  • Recursion might work. Can you explain exactly what you are trying to achieve? Commented Oct 14, 2018 at 22:13
  • You're using a function meant to process flat arrays on a graph. You don't want reduce, you want some form of recursion. Commented Oct 14, 2018 at 22:14

4 Answers


functional style

You were on the right track with your answer; however, recurring based on the user-supplied procedure's length is a misstep. Instead, the variable-length path should be passed as a single, variable-length value – an array.

const reduceTree = (proc, state, tree, path = []) =>
  reduce                        // call reduce with:
    ( (acc, [ key, value ]) =>  // reducer
        isObject (value)               // value is an object (another tree):
          ? reduceTree                 //   recur with:
              ( proc                   //     the proc
              , acc                    //     the acc
              , value                  //     this value (the tree)
              , append (path, key)     //     add this key to the path
              )                        // value is NOT an object (non-tree):
          : proc                       //   call the proc with:
              ( acc                    //     the acc
              , value                  //     this value (non-tree, plain value)
              , append (path, key)     //     add this key to the path
              )
    , state                     // initial input state 
    , Object.entries (tree)     // [ key, value ] pairs of input tree
    )

The free values above are defined using prefix notation, which is more familiar in functional style –

const isObject = x =>
  Object (x) === x

const reduce = (proc, state, arr) =>
  arr .reduce (proc, state)

const append = (xs, x) =>
  xs .concat ([ x ])

Now we have a generic reduceTree function –

const result =
  reduceTree
    ( (acc, value, path) =>           // reducer
        [ ...acc, { path, value } ] 
    , []                              // initial state
    , input                           // input tree
    )

console.log (result)
// [ { path: [ 'a1', 'b1', 'c1', 'd1', 'e1' ], value: 'one' }
// , { path: [ 'a1', 'b1', 'c1', 'd1', 'e2' ], value: 'two' }
// , { path: [ 'a1', 'b1', 'c1', 'd1', 'e3' ], value: 'three' }
// , { path: [ 'a1', 'b1', 'c1', 'd2', 'e1' ], value: 'one' }
// , { path: [ 'a1', 'b1', 'c1', 'd2', 'e2' ], value: 'two' }
// , { path: [ 'a1', 'b1', 'c1', 'd2', 'e3' ], value: 'three' }
// , { path: [ 'a1', 'b1', 'c2', 'd1', 'e1' ], value: 'one' }
// , { path: [ 'a1', 'b1', 'c2', 'd1', 'e2' ], value: 'two' }
// , { path: [ 'a2', 'b1', 'c1', 'd1', 'e1' ], value: 'one' }
// , { path: [ 'a2', 'b1', 'c1', 'd2', 'e1' ], value: 'one' }
// , { path: [ 'a2', 'b2', 'c1', 'd1', 'e1' ], value: 'one' }
// , { path: [ 'a2', 'b2', 'c1', 'd2', 'e1' ], value: 'one' } 
// ]

We can shape the output of the result however we like –

const result =
  reduceTree
    ( (acc, value, path) =>                        // reducer
        ({ ...acc, [ path .join ('.') ]: value })
    , {}                                           // initial state
    , input                                        // input tree
    )

console.log (result)
// { 'a1.b1.c1.d1.e1': 'one'
// , 'a1.b1.c1.d1.e2': 'two'
// , 'a1.b1.c1.d1.e3': 'three'
// , 'a1.b1.c1.d2.e1': 'one'
// , 'a1.b1.c1.d2.e2': 'two'
// , 'a1.b1.c1.d2.e3': 'three'
// , 'a1.b1.c2.d1.e1': 'one'
// , 'a1.b1.c2.d1.e2': 'two'
// , 'a2.b1.c1.d1.e1': 'one'
// , 'a2.b1.c1.d2.e1': 'one'
// , 'a2.b2.c1.d1.e1': 'one'
// , 'a2.b2.c1.d2.e1': 'one'
// }

The input for our test should demonstrate that reduceTree works for various levels of nesting –

test ('better', () => {
  const input =
    { a: { b: { c: 1, d: 2 } }, e: 3 }

  const expected =
    { 'a.b.c': 1, 'a.b.d': 2, e: 3 }

  const result =
    reduceTree
      ( (acc, value, path) =>
          ({ ...acc, [ path .join ('.') ]: value })
      , {}
      , input 
      )

  expect(result).toEqual(expected)
})

Lastly, verify the program works in your browser below –

const isObject = x =>
  Object (x) === x

const reduce = (proc, state, arr) =>
  arr .reduce (proc, state)

const append = (xs, x) =>
  xs .concat ([ x ])

const reduceTree = (proc, state, tree, path = []) =>
  reduce
    ( (acc, [ key, value ]) =>
        isObject (value)
          ? reduceTree
              ( proc
              , acc
              , value
              , append (path, key)
              )
          : proc
              ( acc
              , value
              , append (path, key)
              )
    , state
    , Object.entries (tree)
    )

const input =
  { a: { b: { c: 1, d: 2 } }, e: 3 }

const result =
  reduceTree
    ( (acc, value, path) =>
        [ ...acc, { path, value } ]
    , []
    , input
    )

console.log (result)
// [ { path: [ 'a', 'b', 'c' ], value: 1 }
// , { path: [ 'a', 'b', 'd' ], value: 2 }
// , { path: [ 'e' ], value: 3 }
// ]


… with the help of some friends

Imperative-style generators make light work of this kind of task while offering an intuitive language to describe the intended process. Below we add traverse which generates [ path, value ] pairs for a nested tree (object) –

const traverse = function* (tree = {}, path = [])
{ for (const [ key, value ] of Object.entries (tree))
    if (isObject (value))
      yield* traverse (value, append (path, key))
    else
      yield [ append (path, key), value ]
}

Using Array.from we can plug the generator directly into our existing functional reduce; reduceTree is now just a specialization –

const reduceTree = (proc, state, tree) =>
  reduce
    ( (acc, [ path, value ]) =>
        proc (acc, value, path)
    , state
    , Array.from (traverse (tree))
    )

The call site is the same –

const input =
  { a: { b: { c: 1, d: 2 } }, e: 3 }

const result =
  reduceTree
    ( (acc, value, path) =>
        ({ ...acc, [ path .join ('.') ]: value })
    , {}
    , input
    )

console.log (result)
// { 'a.b.c': 1, 'a.b.d': 2, e: 3 }

Verify the result in your browser below –

const isObject = x =>
  Object (x) === x

const reduce = (proc, state, arr) =>
  arr .reduce (proc, state)

const append = (xs, x) =>
  xs .concat ([ x ])

const traverse = function* (tree = {}, path = [])
{ for (const [ key, value ] of Object.entries (tree))
    if (isObject (value))
      yield* traverse (value, append (path, key))
    else
      yield [ append (path, key), value ]
}

const reduceTree = (proc, state, tree) =>
  reduce
    ( (acc, [ path, value ]) =>
        proc (acc, value, path)
    , state
    , Array.from (traverse (tree))
    )

const input =
  { a: { b: { c: 1, d: 2 } }, e: 3 }

const result =
  reduceTree
    ( (acc, value, path) =>
        ({ ...acc, [ path .join ('.') ]: value })
    , {}
    , input
    )

console.log (result)
// { 'a.b.c': 1, 'a.b.d': 2, e: 3 }


3 Comments

Thanks so much! I wish I could give you extra points for Object.entries, as I now realize what should have been obvious to me: I no longer need a reduceKeyed or to convert various lodash/fp functions to include a key parameter on the iterating function. Yes, my test example should have covered multiple use cases; as posted, it led many here to answer a question that was technically not asked but seems more familiar to experts. I feel bad about that, but hey, I'm still learning. Thanks again.
Please do not feel bad. Your posted answer was on track; just compare how close the two programs are. It takes a lot of practice to gain an intuition about recursive data structures. The other answer here effectively does the same thing using imperative-style generators; also nice! – One of my favourite topics on SO is recursion, so maybe you'll find some of that stuff useful too :D
I added a small update of how I would use generators for this

You can use recursion which is ideal for this kind of traversing, like so:

function traverse(input, acc, path = []) {                       // path will be used internally so you don't need to pass it to get from the outside, thus it has a default value
    Object.keys(input).forEach(key => {                          // for each key in the input
        let newPath = [...path, key];                            // the new path is the old one + the current key
        if(input[key] && typeof input[key] === "object") {       // if the current value (at this path) is an object
            traverse(input[key], acc, newPath);                  // traverse it using the current object as input, the same accumulator and the new path
        } else {                                                 // otherwise (it's not an object)
            if(acc.hasOwnProperty(input[key])) {                 // then check if our accumulator expects this value to be accumulated
                acc[input[key]].push(newPath.join('.'));         // if so, add its path to the according array
            }
        }
    });
}

let input = {"a1":{"b1":{"c1":{"d1":{"e1":"one","e2":"two","e3":"three"},"d2":{"e1":"one","e2":"two","e3":"three"}},"c2":{"d1":{"e1":"one","e2":"two"}}}},"a2":{"b1":{"c1":{"d1":{"e1":"one"},"d2":{"e1":"one"}}},"b2":{"c1":{"d1":{"e1":"one"},"d2":{"e1":"one"}}}},"a3":{"b1":{"c1":{}}}};
let acc = { one: [], two: [], three: [] };

traverse(input, acc);

console.log(acc);

1 Comment

Thank you for your answer, it really helped me to understand the problem. I didn't pick it because, as with all of the other answers, it takes a procedural route to the specific solution that was provided as an example use case, without providing a function API for reuse. The question was not about how to arrive at a new object with the values as keys and paths as values, but about how to implement a function that gives the caller access to keys as variables to manipulate as desired.

I would solve this problem using a recursive generator function

In this example, I've created a separate function childPathsAndValues. Here we've achieved separation of concerns: this function doesn't need to know that you're appending each path to an array. It simply traverses the object and returns the path/value combinations.

function* childPathsAndValues(o) {
  for (let k in o) {
    if (typeof o[k] === 'object') {
      for (let [childPath, value] of childPathsAndValues(o[k])) {
        yield [`${k}.${childPath}`, value];
      }
    } else {
      yield [k, o[k]];
    }
  }
}

const input = {"a1":{"b1":{"c1":{"d1":{"e1":"one","e2":"two","e3":"three"},"d2":{"e1":"one","e2":"two","e3":"three"}},"c2":{"d1":{"e1":"one","e2":"two"}}}},"a2":{"b1":{"c1":{"d1":{"e1":"one"},"d2":{"e1":"one"}}},"b2":{"c1":{"d1":{"e1":"one"},"d2":{"e1":"one"}}}},"a3":{"b1":{"c1":{}}}};
  
const acc = {};
for(let [path, value] of childPathsAndValues(input)) {
   console.log(`${path} = ${value}`);
   acc[value] = acc[value] || [];
   acc[value].push(path);
}

console.log('*** Final Result ***');
console.log(acc);

1 Comment

Thank you for your answer, it really helped me to understand the problem. I didn't pick it because, as with all of the other answers, it takes a procedural route to the specific solution that was provided as an example use case, without providing a function API for reuse. The question was not about how to arrive at a new object with the values as keys and paths as values, but about how to implement a function that gives the caller access to keys as variables to manipulate as desired.

As the other answers indicate, recursion is the key. However, instead of writing and rewriting procedural code that mutates your data and needs to be hand-tooled to each situation, why not use, and reuse, this function wherever the need arises?

Vanilla Javascript:

import { curry, __ } from 'lodash/fp'
const reduce = require('lodash/fp').reduce.convert({ cap: false })
reduce.placeholder = __


const reduceNested = curry((fn, acc, iter, paths) =>
  reduce(
    (acc2, curr, key) =>
      paths.length === fn.length - 3
        ? fn(acc2, curr, ...paths, key)
        : reduceNested(fn, acc2, curr, [...paths, key]),
    acc,
    iter
  )
)

export default reduceNested

Usage:

test('better', () => {
  const result = reduceNested(
    (acc, curr, a, b, c, d, e) => ({
      ...acc,
      [curr]: [...acc[curr], `${a}.${b}.${c}.${d}.${e}`]
    }),
    init,
    input,
    []
  )

  expect(result).toEqual(expected)
})

With Immutable.js:

import { curry } from 'lodash/fp'
import { List } from 'immutable'

const reduce = curry((fn, acc, it) => it.reduce(fn, acc))

const reduceNested = curry((fn, acc, iter, paths) =>
  reduce(
    (acc2, curr, key) =>
      paths.size === fn.length - 3
        ? fn(acc2, curr, ...paths, key)
        : reduceNested(fn, acc2, curr, paths.push(key)),
    acc,
    iter
  )
)

export default reduceNested

Usage:

test('better', () => {
  const result = reduceNested(
    (acc, curr, a, b, c, d, e) =>
      acc.update(curr, i => i.push(`${a}.${b}.${c}.${d}.${e}`)),
    init,
    input,
    List()
  )

  expect(result).toEqual(expected)
})

3 Comments

it's impractical to expect the user-supplied reducer to know the depth of the structure it's reducing and therefore have to specify a parameter for each key in the path – passing the path as a variable-length array to the reducer is a better design
@user633183 I agree that a variable-depth solution may also be desirable, but it's a different problem; this exact fixed-depth problem is something I need in several places, hence the question, so it is certainly useful to me right now. I don't know if it is possible to make a variable-depth variation that would also solve fixed-depth problems.
I added an answer. Let me know if it helps :D
