
I am trying to write my dictionary to a csv file; here is how I am doing it:

with open(path3 + 'extend.csv', 'w') as fw:
    for key1, value in d.items():
        fw.write(value +',')
    fw.write('\n')
fw.close()

The thing is that I get an extra comma at the end of each line. How can I prevent this from happening in the first place?

  • As a side note, dict item order is not maintained. This approach is fine if you don't care what order you're putting those values into the csv file, but I think for most people they do care. Either iterate over specific dict keys or switch to a list for d (which maintains order). Commented Sep 5, 2017 at 19:14
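As a sketch of that suggestion (using a hypothetical dictionary standing in for `d`): dicts do preserve insertion order in Python 3.7+, but fixing the key order explicitly makes the column order deterministic regardless of how `d` was built:

```python
# Hypothetical dictionary; insertion order here is b, a, c
d = {'b': '2', 'a': '1', 'c': '3'}

# Fix the column order explicitly instead of relying on dict order
columns = ['a', 'b', 'c']
values = [d[k] for k in columns]
print(values)  # ['1', '2', '3']
```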

2 Answers


Use the csv module: https://docs.python.org/2/library/csv.html

import csv
with open(path3 + 'extend.csv', 'w') as fw:
    writer = csv.writer(fw, delimiter=',')
    writer.writerow(d.values())
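As a runnable sketch of the above (with a made-up dictionary standing in for `d` and a plain filename standing in for `path3`), the csv module also takes care of quoting values that themselves contain commas, which a manual join would get wrong. On Python 3, opening the file with `newline=''` is recommended by the csv docs to avoid extra blank lines on Windows:

```python
import csv

# Hypothetical sample dictionary standing in for d
d = {'a': 'one', 'b': 'two, with comma', 'c': 'three'}

# newline='' is the csv-recommended mode for Python 3
with open('extend.csv', 'w', newline='') as fw:
    writer = csv.writer(fw, delimiter=',')
    writer.writerow(d.values())
# The value containing a comma comes out quoted: one,"two, with comma",three
```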

2 Comments

Can I substitute a computed double value for d.values() here? What are the origin and structure of d.values() in this context? A dictionary?
Yes, writer.writerow works with a list of strings or numbers as argument (docs.python.org/2/library/csv.html#csv.csvwriter.writerow) or, in newer versions, with any iterable providing the values for the row (docs.python.org/3/library/csv.html#csv.csvwriter.writerow). d is the dictionary from the code snippet in the question.

You can use ",".join to do this. Also, you can access the values directly with d.values(), so you don't need to iterate over the dict and extract them yourself.

So essentially, your code can be simplified to:

with open(path3 + 'extend.csv', 'w') as fw:
    fw.write(",".join(d.values()))
    fw.write('\n')
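One caveat, shown with a hypothetical dictionary standing in for `d`: str.join only accepts strings, so if any values are numbers they need converting first, e.g. with a generator expression:

```python
# Hypothetical dictionary with mixed value types
d = {'a': 1, 'b': 2.5, 'c': 'x'}

# join raises TypeError on non-strings, so convert each value
line = ",".join(str(v) for v in d.values())
print(line)  # 1,2.5,x
```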

