JAXP00010004 and java.lang.OutOfMemoryError: GC overhead limit exceeded
I have a Maven project in which I need to parse a big RDF file.

My code is:

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;

import org.eclipse.rdf4j.model.Model;
import org.eclipse.rdf4j.model.Statement;
import org.eclipse.rdf4j.model.impl.LinkedHashModel;
import org.eclipse.rdf4j.rio.RDFFormat;
import org.eclipse.rdf4j.rio.RDFHandlerException;
import org.eclipse.rdf4j.rio.RDFParseException;
import org.eclipse.rdf4j.rio.RDFParser;
import org.eclipse.rdf4j.rio.RDFWriter;
import org.eclipse.rdf4j.rio.Rio;
import org.eclipse.rdf4j.rio.helpers.StatementCollector;

public class ConvertOntology {

    public static void main(String[] args) throws RDFParseException, RDFHandlerException, IOException {
        // Note the escaped backslashes in the Windows path.
        String file = "C:\\Users\\user\\Desktop\\fileA.rdf";

        File initialFile = new File(file);
        InputStream input = new FileInputStream(initialFile);

        RDFParser parser = Rio.createParser(RDFFormat.RDFXML);
        parser.setPreserveBNodeIDs(true);

        // Collect every parsed statement into an in-memory model.
        Model model = new LinkedHashModel();
        parser.setRDFHandler(new StatementCollector(model));
        parser.parse(input, initialFile.getAbsolutePath());

        // Serialize the collected model back out as RDF/XML.
        FileOutputStream out = new FileOutputStream("C:\\Users\\user\\Desktop\\fileB.rdf");
        RDFWriter writer = Rio.createWriter(RDFFormat.RDFXML, out);
        try {
            writer.startRDF();
            for (Statement st : model) {
                writer.handleStatement(st);
            }
            writer.endRDF();
        } catch (RDFHandlerException e) {
            // oh no, do something!
        } finally {
            input.close();
            out.close();
        }
    }
}


The code works fine for small files, but with a big file I get the following exception:



JAXP00010001: The parser has encountered more than "64000" entity expansions in this document; this is the limit imposed by the JDK


In Eclipse I run the project via Run >> Run Configurations >> Arguments and set the VM argument -DentityExpansionLimit=1000000. I then get a new exception due to the memory limit:



Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded


So the maximum heap I can set locally is too small for what the file needs, and I want to execute my code on a server instead. Usually I compile and run my Maven project on the server with:

mvn compile
mvn exec:java


My question: I set -DentityExpansionLimit=5000000 in Maven with

mvn -DentityExpansionLimit=5000000 exec:java


but I get another exception:



[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2.1:java (default-cli) on project rdf4j-getting-started: An exception occured while executing the Java class. null: InvocationTargetException: JAXP00010004: The accumulated size of entities is "50,000,018" that exceeded the "50,000,000" limit set by "FEATURE_SECURE_PROCESSING". [line 1, column 34] -> [Help 1]


How can I solve this issue?










java maven jvm rdf4j

asked Nov 18 at 20:03 – bib (edited Nov 18 at 21:31)
  • As an aside: if your code is meant to process a very big file, consider using streaming processing instead of reading the entire file into a Model in memory. (A sketch of this approach follows these comments.)
    – Jeen Broekstra
    Nov 19 at 4:50










  • The default memory limit is 1/4 of main memory. You could try setting it to 80% of main memory.
    – Peter Lawrey
    Nov 19 at 7:32










  • @JeenBroekstra, can you please explain your idea in more detail? It will surely help in my case.
    – bib
    Nov 19 at 15:09






  • @bib it's explained in the rdf4j docs. See docs.rdf4j.org/programming/#_writing_rdf
    – Jeen Broekstra
    Nov 19 at 21:53
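
A minimal sketch of the streaming approach Jeen Broekstra suggests, based on the rdf4j Rio API (an RDFWriter also implements RDFHandler, so the parser can hand each statement straight to the writer as it is parsed). File names are placeholders, and note that this only addresses the memory problem; the JAXP entity limits still apply to the underlying XML parser:

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;

import org.eclipse.rdf4j.rio.RDFFormat;
import org.eclipse.rdf4j.rio.RDFParser;
import org.eclipse.rdf4j.rio.RDFWriter;
import org.eclipse.rdf4j.rio.Rio;

public class ConvertOntologyStreaming {

    public static void main(String[] args) throws Exception {
        File initialFile = new File("fileA.rdf"); // placeholder path

        RDFParser parser = Rio.createParser(RDFFormat.RDFXML);
        parser.setPreserveBNodeIDs(true);

        try (InputStream input = new FileInputStream(initialFile);
             OutputStream out = new FileOutputStream("fileB.rdf")) {
            // The writer doubles as the parser's RDFHandler: parse() calls
            // startRDF()/endRDF() on it and forwards every statement, so no
            // Model is ever built in memory.
            RDFWriter writer = Rio.createWriter(RDFFormat.RDFXML, out);
            parser.setRDFHandler(writer);
            parser.parse(input, initialFile.getAbsolutePath());
        }
    }
}

On Peter Lawrey's point about the heap: exec:java runs inside the Maven JVM itself, so the heap for mvn exec:java can be raised through MAVEN_OPTS, e.g. MAVEN_OPTS=-Xmx8g mvn exec:java.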















2 Answers
By using mvn -Djdk.xml.totalEntitySizeLimit=0 -DentityExpansionLimit=0 exec:java I solved my issue. Hope that helps.
– bib, Nov 19 at 15:00
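
For completeness, a sketch of a possible alternative to passing the flags on the command line: setting the same system properties programmatically at the very start of main, before any parser is created. The property names are taken from the command above, 0 removes the corresponding limit, and this assumes the JAXP implementation reads the properties at initialization:

public class ConvertOntology {

    public static void main(String[] args) throws Exception {
        // Must run before the first XML parse; "0" disables each limit.
        // Property names follow the mvn command in this answer.
        System.setProperty("jdk.xml.totalEntitySizeLimit", "0");
        System.setProperty("entityExpansionLimit", "0");

        // ... then parse and write as in the question's code ...
    }
}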






According to the documentation you can use a negative value to remove the limits.
– Jason Armstrong, Nov 18 at 23:12






    • I tried 0 but it is not working.
      – bib
      Nov 18 at 23:27










    • Can you define not working? Not working on your server or locally? This likely won’t fix anything in a memory constrained environment.
      – Jason Armstrong
      Nov 18 at 23:31










    • Locally I can't test because of my limited PC resources. On the server I tested with the command mvn -DtotalEntitySizeLimit=0 -DentityExpansionLimit=0 exec:java and I still get the last exception.
      – bib
      Nov 19 at 2:00
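
Note the property name in that last attempt: it passes -DtotalEntitySizeLimit, whereas the command that eventually worked in the answer above uses the fully qualified -Djdk.xml.totalEntitySizeLimit, i.e. mvn -Djdk.xml.totalEntitySizeLimit=0 -DentityExpansionLimit=0 exec:java.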












