
I have a text file containing some Hindi characters, and my default character encoding is ISO 8859-1. I am using `FileInputStream` to read the data from that file and `FileOutputStream` to write the data to another text file.

My code is:

    FileInputStream fis = new FileInputStream("D:/input.txt");
    int i = -1;
    FileOutputStream fos = new FileOutputStream("D:/outputNew.txt");
    while((i = fis.read())!= -1){
        fos.write(i);
    }
    fos.flush();
    fos.close();
    fis.close();

I am not specifying an encoding ("UTF-8") anywhere, but the output file still has the proper text. How is that happening? I don't understand.

  • Your system default charset is Latin-1, but what's Java's default charset...? Commented Dec 15, 2012 at 16:45
  • @Makoto: How do I find Java's default charset? I am using `Charset.defaultCharset()`, which prints "ISO-8859-1". Commented Dec 15, 2012 at 16:53

1 Answer


It's working because you don't use any char in your program. You're just transferring raw bytes from one file to another. It would be a problem if you read and wrote characters, because then an encoding would be used to transform the bytes in the files to characters, and vice-versa.
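A minimal, self-contained sketch of this point, using in-memory streams in place of the asker's files (the Hindi sample string is an assumption standing in for the file's contents): the byte-by-byte copy loop never consults any charset, so the output is byte-identical to the input regardless of how the text was encoded.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class ByteCopyDemo {
    public static void main(String[] args) throws IOException {
        // UTF-8 bytes of a Hindi word, standing in for the contents of input.txt
        byte[] original = "नमस्ते".getBytes(StandardCharsets.UTF_8);

        ByteArrayInputStream in = new ByteArrayInputStream(original);
        ByteArrayOutputStream out = new ByteArrayOutputStream();

        // Same loop as in the question: raw bytes, no decoding or encoding
        int b;
        while ((b = in.read()) != -1) {
            out.write(b);
        }
        byte[] copy = out.toByteArray();

        System.out.println(Arrays.equals(original, copy)); // true: byte-identical
        System.out.println(new String(copy, StandardCharsets.UTF_8)); // नमस्ते
    }
}
```

Because no `String` or `char` is ever created, the default charset never enters the picture.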


2 Comments

Even if I use a `Reader` and a `Writer`, I see the same behavior: I am not specifying an encoding ("UTF-8") anywhere, yet the output file still has the proper text.
That means you're lucky that the bytes in the file, which represent Hindi characters encoded in UTF-8, also happen to represent valid characters in Latin-1. A bit like having an email written in English that uses only words that also exist in French.
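This "luck" can be demonstrated directly. ISO-8859-1 maps every possible byte value 0x00–0xFF to exactly one character, so decoding arbitrary UTF-8 bytes with Latin-1 produces mojibake but never fails, and re-encoding that mojibake with Latin-1 restores the original bytes exactly. A sketch (the Hindi sample string is an assumption for illustration):

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class Latin1RoundTripDemo {
    public static void main(String[] args) {
        // UTF-8 bytes of a Hindi word, standing in for the file's contents
        byte[] utf8Bytes = "नमस्ते".getBytes(StandardCharsets.UTF_8);

        // Decoding with Latin-1 yields garbled characters, but never an error:
        // each byte maps to exactly one char in ISO-8859-1
        String mojibake = new String(utf8Bytes, StandardCharsets.ISO_8859_1);

        // Encoding back with Latin-1 recovers the original bytes exactly
        byte[] roundTripped = mojibake.getBytes(StandardCharsets.ISO_8859_1);

        System.out.println(Arrays.equals(utf8Bytes, roundTripped)); // true
    }
}
```

This is why a `Reader`/`Writer` pair using the Latin-1 default charset still reproduces the file byte for byte; the same round trip would not be lossless with most other default charsets.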
