JAXP00010004 and java.lang.OutOfMemoryError: GC overhead limit exceeded
I have a Maven project in which I need to parse a big RDF file.
My code is:
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;

import org.eclipse.rdf4j.model.Model;
import org.eclipse.rdf4j.model.Statement;
import org.eclipse.rdf4j.model.impl.LinkedHashModel;
import org.eclipse.rdf4j.rio.RDFFormat;
import org.eclipse.rdf4j.rio.RDFHandlerException;
import org.eclipse.rdf4j.rio.RDFParseException;
import org.eclipse.rdf4j.rio.RDFParser;
import org.eclipse.rdf4j.rio.RDFWriter;
import org.eclipse.rdf4j.rio.Rio;
import org.eclipse.rdf4j.rio.helpers.StatementCollector;

public class ConvertOntology {

    public static void main(String[] args) throws RDFParseException, RDFHandlerException, IOException {
        // Parse the whole input file into an in-memory Model
        String file = "C:\\Users\\user\\Desktop\\fileA.rdf";
        File initialFile = new File(file);
        InputStream input = new FileInputStream(initialFile);
        RDFParser parser = Rio.createParser(RDFFormat.RDFXML);
        parser.setPreserveBNodeIDs(true);
        Model model = new LinkedHashModel();
        parser.setRDFHandler(new StatementCollector(model));
        parser.parse(input, initialFile.getAbsolutePath());

        // Write the collected statements back out as RDF/XML
        FileOutputStream out = new FileOutputStream("C:\\Users\\user\\Desktop\\fileB.rdf");
        RDFWriter writer = Rio.createWriter(RDFFormat.RDFXML, out);
        try {
            writer.startRDF();
            for (Statement st : model) {
                writer.handleStatement(st);
            }
            writer.endRDF();
        }
        catch (RDFHandlerException e) {
            // oh no, do something!
        }
        finally {
            out.close();
        }
    }
}
The code works fine for small files, but with a big file I get the following exception:
JAXP00010001: The parser has encountered more than "64000" entity expansions in this document; this is the limit imposed by the JDK
In Eclipse I run the project via Run >> Run Configurations >> Arguments and set the VM argument -DentityExpansionLimit=1000000. I then get a new exception, this time due to the memory limit:
Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
So the maximum heap I can set locally is too small for what the file needs, and I want to execute my code on a server instead. Usually I compile and run my Maven project on the server with:
mvn compile
mvn exec:java
My question:
I set -DentityExpansionLimit=5000000 in Maven with
mvn -DentityExpansionLimit=5000000 exec:java
but I get another exception:
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2.1:java (default-cli) on project rdf4j-getting-started: An exception occured while executing the Java class. null: InvocationTargetException: JAXP00010004: The accumulated size of entities is "50,000,018" that exceeded the "50,000,000" limit set by "FEATURE_SECURE_PROCESSING". [line 1, column 34] -> [Help 1]
How can I solve this issue?
java maven jvm rdf4j
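(For context on the workflow above: exec:java runs the class inside Maven's own JVM, so a -D option passed to mvn becomes an ordinary system property visible to the XML parser, and the heap available to the class is Maven's heap. If the OutOfMemoryError is the limiting factor, the heap can typically be raised through MAVEN_OPTS before invoking exec:java, for example:
export MAVEN_OPTS="-Xmx4g"
mvn compile exec:java
The -Xmx4g figure is only illustrative and not taken from the question; pick a value that fits the server's memory.)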
As an aside: if your code is meant to process a very big file, consider using streaming processing instead of reading the entire file into a Model in memory. – Jeen Broekstra, Nov 19 at 4:50
The default memory limit is 1/4 of main memory. You could try setting it to 80% of main memory. – Peter Lawrey, Nov 19 at 7:32
@JeenBroekstra, can you please explain your idea in more detail? I'm sure it will help in my case. – bib, Nov 19 at 15:09
@bib it's explained in the rdf4j docs. See docs.rdf4j.org/programming/#_writing_rdf – Jeen Broekstra, Nov 19 at 21:53
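To illustrate the streaming approach mentioned in the comments above (a minimal sketch following the rdf4j Rio documentation; the class name and file paths are placeholders taken from the question): because an RDFWriter is itself an RDFHandler, the parser can pass each statement straight to the writer as it is parsed, so nothing accumulates in an in-memory Model and memory use stays roughly constant regardless of file size.
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import org.eclipse.rdf4j.rio.RDFFormat;
import org.eclipse.rdf4j.rio.RDFParser;
import org.eclipse.rdf4j.rio.RDFWriter;
import org.eclipse.rdf4j.rio.Rio;

public class ConvertOntologyStreaming {
    public static void main(String[] args) throws Exception {
        String inFile = "C:\\Users\\user\\Desktop\\fileA.rdf";
        String outFile = "C:\\Users\\user\\Desktop\\fileB.rdf";
        try (InputStream in = new FileInputStream(inFile);
             OutputStream out = new FileOutputStream(outFile)) {
            RDFParser parser = Rio.createParser(RDFFormat.RDFXML);
            parser.setPreserveBNodeIDs(true);
            RDFWriter writer = Rio.createWriter(RDFFormat.RDFXML, out);
            // An RDFWriter is an RDFHandler, so each parsed statement is written
            // out immediately instead of being collected in a Model first.
            parser.setRDFHandler(writer);
            parser.parse(in, inFile);
        }
    }
}
Note that this removes the memory pressure caused by the Model, but the JAXP entity limits still apply to the underlying XML parser, so the -D properties discussed in the answers below are still relevant.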
2 Answers
By using mvn -Djdk.xml.totalEntitySizeLimit=0 -DentityExpansionLimit=0 exec:java I solved my issue. Hope that will help. – bib, answered Nov 19 at 15:00
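If passing the properties on the command line is not an option, the same limits can also be raised from inside the program, provided the properties are set before the first XML parser is created. A minimal sketch (these are the standard JAXP property names, not rdf4j-specific; a value of 0 or less means no limit):
// At the very start of main(), before any parsing happens:
System.setProperty("jdk.xml.entityExpansionLimit", "0");
System.setProperty("jdk.xml.totalEntitySizeLimit", "0");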
According to the documentation you can use a negative value to remove the limits. – Jason Armstrong, answered Nov 18 at 23:12
I tried 0 but it is not working. – bib, Nov 18 at 23:27
Can you define "not working"? Not working on your server or locally? This likely won't fix anything in a memory-constrained environment. – Jason Armstrong, Nov 18 at 23:31
Locally I can't test because of my limited PC resources. On the server I tested with the following command: mvn -DtotalEntitySizeLimit=0 -DentityExpansionLimit=0 exec:java, and I still get the last exception shown above. – bib, Nov 19 at 2:00
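Note the property name in that last command: -DtotalEntitySizeLimit=0 is not the fully qualified JAXP name, while the command that did work in the other answer uses -Djdk.xml.totalEntitySizeLimit=0. That difference is likely why one run still hit the FEATURE_SECURE_PROCESSING limit and the other did not.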