The following object is used as a starting point to populate options of a select box in the UI of an app:
const months = {
"1": "Jan",
"2": "Feb",
"3": "Mar",
"4": "Apr",
"5": "May",
"6": "Jun",
"7": "Jul",
"8": "Aug",
"9": "Sep",
"10": "Oct",
"11": "Nov",
"12": "Dec"
}
The options for the select box, however, must be limited based on the contents of another array: any month whose number appears in that array should be excluded. Here is an example of that array:
const existingMonths = [
1,
2,
3,
4,
5,
6,
7,
8,
12
];
So in this example, the final options object for the select box should be:
const availableMonths = {
"9": "Sep",
"10": "Oct",
"11": "Nov"
}
I'm having difficulty figuring out how to build the availableMonths object. This does not give the desired output:
const availableMonths = Object.entries(months).filter(k => !existingMonths.includes(k));
`k` is an array with two entries: the key first, the value second. If you want to check whether the key exists in another array, you need to take just the first element. Also, the key is a string, while the `existingMonths` array contains numbers, so you need to convert the two to the same type. Finally, `filter` on `Object.entries` returns an array of entries, not an object, so you need `Object.fromEntries` to turn the result back into an object.
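Putting those three fixes together, one working sketch (destructuring the entry to get the key, converting it with `Number`, and wrapping the result in `Object.fromEntries`) looks like this:

```javascript
const months = {
  "1": "Jan", "2": "Feb", "3": "Mar", "4": "Apr",
  "5": "May", "6": "Jun", "7": "Jul", "8": "Aug",
  "9": "Sep", "10": "Oct", "11": "Nov", "12": "Dec"
};

const existingMonths = [1, 2, 3, 4, 5, 6, 7, 8, 12];

// Keep only the entries whose numeric key is NOT in existingMonths,
// then rebuild an object from the surviving [key, value] pairs.
const availableMonths = Object.fromEntries(
  Object.entries(months).filter(
    ([key]) => !existingMonths.includes(Number(key))
  )
);

console.log(availableMonths); // { "9": "Sep", "10": "Oct", "11": "Nov" }
```

Alternatively, you could convert in the other direction and check `existingMonths.includes(parseInt(key, 10))`, or map `existingMonths` to strings up front; the important part is that both sides of the comparison end up the same type.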