
I am using the UploadCsv method of the BigQueryClient class. This method accepts a Stream object. Can I change the encoding of my file and pass the file to the stream parameter without first converting the text file to a byte array? The file is large, and converting it to a byte array throws an OutOfMemoryException.

Task<BigQueryJob> UploadCsvAsync(string datasetId, string tableId, TableSchema schema, Stream input, UploadCsvOptions options = null, CancellationToken cancellationToken = default(CancellationToken));
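
For reference, a simplified sketch of what I would like to do, passing the file as a stream directly instead of loading a byte array (the client, dataset/table names, schema, and file path below are placeholders, not my actual code):

```csharp
// Hypothetical sketch only: dataset/table names and the file path are placeholders.
using System.IO;
using System.Threading.Tasks;
using Google.Apis.Bigquery.v2.Data;   // TableSchema
using Google.Cloud.BigQuery.V2;       // BigQueryClient

class UploadSketch
{
    static async Task UploadAsync(BigQueryClient client, TableSchema schema)
    {
        // FileStream derives from System.IO.Stream, so the file can be handed to
        // the method directly instead of a MemoryStream over a byte[].
        using (Stream input = File.OpenRead("large-file.csv"))
        {
            BigQueryJob job = await client.UploadCsvAsync(
                "my_dataset", "my_table", schema, input);
        }
    }
}
```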

1 Answer


It seems that it is not possible to change the encoding of your input here, because UploadCsvAsync requires the input to be a System.IO.Stream.

Stream is the abstract base class of all streams in .NET. A stream is simply a sequence of bytes, coming from something like a file, an input/output device, an inter-process communication pipe, or a TCP/IP socket. Because the method only sees raw bytes, there is no parameter for specifying or changing the encoding of the input.

This covers the main question, "Is it possible to change the encoding?": the answer is no.

The next question you might ask is: "If it is not possible, then what do we need to do?" That's a great question!

For that, the OutOfMemoryException can be avoided by changing the way the data is read. Reading the entire file into a single allocation is not the best option because of the amount of memory it requires.

Instead, use a buffer to read the data in chunks, as they did in this other question, or use smaller arrays or jagged arrays, as done in this question.
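
As a rough illustration of the buffered approach (the file path and buffer size here are just examples, not taken from your code), reading the file in fixed-size chunks keeps memory usage constant no matter how large the file is:

```csharp
// Sketch of chunked reading: only buffer.Length bytes are held in memory at a
// time, instead of one byte[] the size of the whole file.
using System;
using System.IO;

class ChunkedReadSketch
{
    static void Main()
    {
        var buffer = new byte[81920];   // reusable 80 KB buffer (example size)
        long totalBytes = 0;

        using (var file = File.OpenRead("large-file.csv"))   // placeholder path
        {
            int read;
            while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
            {
                totalBytes += read;     // process the current chunk here
            }
        }

        Console.WriteLine($"Read {totalBytes:N0} bytes without a full in-memory copy.");
    }
}
```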

Hope this is helpful! :)


1 Comment

If you are still having issues with the OutOfMemoryException, I would recommend creating a new question to focus on it.
