I think this has a simple answer, but I haven't been able to crack it. Say you have a URL that needs to be broken down into four parts:
comp1 = 'www.base.com/'
comp2 = ['AAA', 'BBB', 'CCC']  # a list of letters
comp3 = ['2019/10/21', '2019/10/20', '2019/10/19']  # another list, but this time dates
comp4 = '/example.html'
I have tried to combine these in multiple different ways, and I know urllib.parse.urljoin is the best option, but it only joins two URL components at a time (its signature is urljoin(base, url, allow_fragments=True)):
for i in comp2:
    iter1 = urllib.parse.urljoin(comp1, i)
    print(iter1)  # this pairs the first two components nicely

for j in comp3:
    iter2 = urllib.parse.urljoin(j, comp4)
    print(iter2)  # this just returns '/example.html', and nothing is joined
What's the most pythonic way to join these four components? I've tried ''.join(), but that takes a single iterable of strings, and I couldn't see how to weave the lists into it. I'm going to have a lot more than just three iterations. In R, I would just slam my components into a paste0() and call it a night.
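A minimal sketch of the plain string-formatting route, which is the closest Python analogue of paste0(). This assumes every letter/date combination is wanted; if the two lists should instead be paired element-wise, swap itertools.product for zip:

```python
from itertools import product

comp1 = 'www.base.com/'
comp2 = ['AAA', 'BBB', 'CCC']
comp3 = ['2019/10/21', '2019/10/20', '2019/10/19']
comp4 = '/example.html'

# An f-string concatenates any number of pieces, like R's paste0().
# comp1 already ends with '/' and comp4 already starts with '/',
# so only one extra '/' is needed between the letters and the date.
urls = [f'{comp1}{letters}/{date}{comp4}'
        for letters, date in product(comp2, comp3)]

print(urls[0])  # www.base.com/AAA/2019/10/21/example.html
```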
Comments:

- www.base.com/AAABBBCCC/2019/10/212019/10/20/example.html: if you can provide exactly what you are looking for, it will help get an answer. :)
- It looks like you are just trying to combine strings and lists; maybe this will help? stackoverflow.com/questions/12633024/…
- reduce in combination with urljoin seems like a good option, see here.
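A minimal sketch of the reduce-with-urljoin suggestion from the comments. Note urljoin's relative-resolution rules: each intermediate part must end with '/', and a part starting with '/' (like the original comp4) resets the path to the root, which is exactly why '/example.html' swallowed everything in the second loop above. The 'http://' scheme here is an assumption, added because urljoin resolves against an absolute base:

```python
from functools import reduce
from urllib.parse import urljoin

# Each non-final part ends in '/', and no part starts with '/',
# so every urljoin step appends instead of replacing the path.
parts = ['http://www.base.com/', 'AAA/', '2019/10/21/', 'example.html']
url = reduce(urljoin, parts)

print(url)  # http://www.base.com/AAA/2019/10/21/example.html
```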