
I have written some code whose memory usage seems to explode.

I don't understand why, since most of the objects are created in helper methods and I would expect that space to be freed once each method returns (or is that wrong?).

I am rather new to the subject of memory consumption and don't know how to improve it.

Configuring the JVM with the flag -Xmx8192 did not help; it only let me process 3 more packages (initially 27 packages were processed; with the -Xmx flag I reached 30).

Could I slow the program down somehow to give the GC time to free space, or would that not help?

Here is the code I have written so far:

    import java.io.File;
    import java.io.IOException;
    import java.nio.charset.Charset;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.ArrayList;
    import java.util.List;

    public class mainToTest {

        public static void main(String[] args) throws IOException {
            String str;
            String home = "C:/Users/Eirini/Desktop/OP/";
            String s = "13092017-1800";
            File descriptor;
            final Charset ENCODING = StandardCharsets.UTF_8;
            Path path = Paths.get(home + "output.xml");
            List<String> LOP = new ArrayList<>();
            LOP.clear();

            List<String> lines;
            int i, j;

            File[] a = (new File(home + s)).listFiles();
            System.out.println("Packages found...");
            for (i = 0; i < a.length; i++) {
                System.out.print("For package " + i);
                descriptor = findDescriptor(a[i]);
                XSL.transformation(descriptor, new File(home + "extractprocedureid.xslt"));

                lines = Files.readAllLines(path, ENCODING);
                str = lines.get(1);
                if (LOP.isEmpty()) {
                    LOP.add(str);
                }
                for (j = 0; j < LOP.size(); j++) {
                    if (!(str.equals(LOP.get(j)))) {
                        LOP.add(str);
                    }
                }
            }
            System.out.println("");
            System.out.println("Finished. Procedures found:");
            for (i = 0; i < LOP.size(); i++) {
                System.out.println(LOP.get(i));
            }
        }

        public static File findDescriptor(File pckg) {
            int i, k;
            int j = 0;
            File[] ind = pckg.listFiles();
            System.out.println(" all items and descriptor listed");

            k = ind.length;

            for (i = 0; i < k; i++) {
                System.out.println("File " + i);
                if (ind[i].getName().endsWith("_immc.xml")) {
                    j = i;
                    i = 200;   // force the loop to exit
                    System.out.println("Descriptor found!");
                } else {
                    System.out.println(" not a descriptor. Moving to the next");
                }
            }

            return ind[j];
        }
    }

and the XSL.transformation method looks like this:

    public static void transformation(File immc, File xslt) {

        Source xmlInput = new StreamSource(immc);
        Source xsl = new StreamSource(xslt);
        Result xmlOutput = new StreamResult(new File("C:/Users/Eirini/Desktop/OP/output.xml"));

        try {
            Transformer transformer = TransformerFactory.newInstance().newTransformer(xsl);
            transformer.transform(xmlInput, xmlOutput);
        } catch (TransformerException e) {
            System.out.println("Exception caught");
        }
        System.out.println("XSLT transformation finished... The result can be found in C:/Users/Eirini/Desktop/OP/output.xml");
    }

The error usually occurs right after XSL.transformation (the second piece of code) returns.

Thanks

  • Just profile it. By the way -Xmx8192 gives just 8kb of memory, try -Xmx8192m instead. (But I'll assume it was just a typo.) Commented Oct 8, 2017 at 22:35
  • How large is the output of these transformed files? For every file that you transform, you are reading the lines of text and appending to the LOP ArrayList. So, you will be holding the output of all of these transformed files in memory. Try not appending to your LOP to see if things remain constant, or invoke LOP.clear() after each file to see if you stop throwing OOM errors and then tune from there. Commented Oct 8, 2017 at 22:41
  • yes my exact command was java -Xmx2048m -Xms512m mainToTest Commented Oct 8, 2017 at 22:44
  • the files after the transformation are 2-5 lines Commented Oct 8, 2017 at 22:45
  • @lexicore what do you mean by "profile it"? Commented Oct 8, 2017 at 22:46

1 Answer


Looks like the problem is these lines:

    for (j = 0; j < LOP.size(); j++) {
        if (!(str.equals(LOP.get(j)))) {
            LOP.add(str);
        }
    }

Whenever str holds a value not yet in the list, this loop appends one copy of str for every existing element that differs from it, so the list roughly doubles in size each time a new value arrives. That gives you exponential memory growth right there (and fills LOP with duplicates).
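A minimal sketch of the fix (the sample ids below are invented for illustration): either check membership before appending, or collect the ids in a `Set`, which enforces uniqueness by itself.

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class DistinctIds {
    public static void main(String[] args) {
        // Stand-in for the values of str read from output.xml, one per package.
        String[] idsRead = {"proc-1", "proc-2", "proc-1", "proc-3", "proc-2"};

        // Option 1: only append when the value is not already in the list.
        List<String> lop = new ArrayList<>();
        for (String str : idsRead) {
            if (!lop.contains(str)) {   // O(n) scan; fine for a handful of ids
                lop.add(str);
            }
        }
        System.out.println(lop);        // [proc-1, proc-2, proc-3]

        // Option 2: a LinkedHashSet keeps insertion order and rejects duplicates.
        Set<String> distinct = new LinkedHashSet<>();
        for (String str : idsRead) {
            distinct.add(str);          // no-op if str is already present
        }
        System.out.println(distinct);   // [proc-1, proc-2, proc-3]
    }
}
```

Either variant replaces the inner `for (j = 0; ...)` loop entirely; the initial `if (LOP.isEmpty())` special case also becomes unnecessary.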
