
I have two XML files. The first contains a list of items, and the second contains a list of review details. I need to check whether each item has a review in the second file. Currently I loop over all of the items, and for each item an inner loop checks whether it appears in the review list (the second XML). Because these XML files contain a lot of data, this takes a long time. Is there a more efficient way to achieve this?

example:

for x in price.iter('Item'):  # first XML, with the items
    for z in promo.iter('Review'):  # second XML, with the reviews
        if z.find('ItemCode').text == x.find('ItemCode').text:
            print(True)  # found the item in the review list
  • Why are you leaving us to imagine your setup? Please include some code to illustrate what you're doing. You should include an MCVE. Commented Jun 23, 2018 at 14:26
  • It looks like you want to load the contents into dictionaries. I appreciate that you included your code; can you also give a small sample of each file? Commented Jun 23, 2018 at 14:34
  • If you wrote it in XQuery, there's a good chance your XQuery processor would do the optimization automatically. Commented Jun 23, 2018 at 18:32

1 Answer


Well, to check whether an item exists in a list, you can simply use:

if x in List:
    do_something()

The expression `x in List` evaluates to a Boolean, True or False. If you also need access to the matching item in the list, I would recommend checking out a similar question here:

Python: how to check that if an item is in a list efficiently?
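Applied to the XML in the question, the standard fix for the quadratic nested loop is to collect the review `ItemCode` values into a set once, then test each item's code against that set. Set membership is O(1) on average, so the whole pass becomes O(n + m) instead of O(n × m). A minimal sketch, using made-up sample data in place of the asker's two files:

```python
import xml.etree.ElementTree as ET

# Hypothetical stand-ins for the two XML files in the question.
price_xml = """<Prices>
  <Item><ItemCode>100</ItemCode></Item>
  <Item><ItemCode>200</ItemCode></Item>
</Prices>"""
promo_xml = """<Promos>
  <Review><ItemCode>200</ItemCode></Review>
  <Review><ItemCode>300</ItemCode></Review>
</Promos>"""

price = ET.fromstring(price_xml)
promo = ET.fromstring(promo_xml)

# Build the set once: one O(m) pass over the reviews.
review_codes = {z.find('ItemCode').text for z in promo.iter('Review')}

# Each membership test is then O(1) on average.
for x in price.iter('Item'):
    code = x.find('ItemCode').text
    if code in review_codes:
        print(code, 'has a review')
```

The same idea works with a dictionary if you need the full `Review` element back, e.g. `{z.find('ItemCode').text: z for z in promo.iter('Review')}`.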

Best, Dhvani


2 Comments

And that still iterates the entire list in the worst case. This is no different, really, from the OP adding a break to their loop. It probably isn't much more efficient than what they're doing.
Well, if the list is sorted, it will save some time. But yes, it is still O(n).
