java - OutOfMemoryError on Tomcat 7
I am developing a web app that takes a zip file uploaded by the user, unzips it on the server, and processes the files. It works like a charm when the zip file is not very large (20-25 MB), but if the file is around 50 MB or larger, it produces an OutOfMemoryError.
I have tried to increase the Java maximum memory allocation pool by adding
export CATALINA_OPTS="-Xmx1024m"
to startup.sh in Tomcat 7, but the error still persists.
AFAIK, the problem lies in unzipping the .zip file. top
shows that Tomcat uses 800 MB of memory during the extraction of the 50 MB file. Is there a solution that enables uploads of up to ~200 MB, whilst using the available memory efficiently?
The code for unzipping is as follows:
package user;

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class Unzip {
    public void unzipFile(String filePath, String oPath) {
        FileInputStream fis = null;
        ZipInputStream zipIs = null;
        ZipEntry zEntry = null;
        try {
            fis = new FileInputStream(filePath);
            zipIs = new ZipInputStream(new BufferedInputStream(fis));
            while ((zEntry = zipIs.getNextEntry()) != null) {
                try {
                    byte[] tmp = new byte[8 * 1024];
                    FileOutputStream fos = null;
                    String opFilePath = oPath + zEntry.getName();
                    System.out.println("Extracting file " + opFilePath);
                    fos = new FileOutputStream(opFilePath);
                    int size = 0;
                    while ((size = zipIs.read(tmp)) != -1) {
                        fos.write(tmp, 0, size);
                    }
                    fos.flush();
                    fos.close();
                } catch (Exception ex) {
                }
            }
            zipIs.close();
            fis.close();
        } catch (FileNotFoundException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
The error trace is as follows:
HTTP Status 500 - javax.servlet.ServletException: java.lang.OutOfMemoryError: Java heap space

type Exception report

message javax.servlet.ServletException: java.lang.OutOfMemoryError: Java heap space

description The server encountered an internal error that prevented it from fulfilling this request.

exception

org.apache.jasper.JasperException: javax.servlet.ServletException: java.lang.OutOfMemoryError: Java heap space
    org.apache.jasper.servlet.JspServletWrapper.handleJspException(JspServletWrapper.java:549)
    org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:455)
    org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:390)
    org.apache.jasper.servlet.JspServlet.service(JspServlet.java:334)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:727)

root cause

javax.servlet.ServletException: java.lang.OutOfMemoryError: Java heap space
    org.apache.jasper.runtime.PageContextImpl.doHandlePageException(PageContextImpl.java:916)
    org.apache.jasper.runtime.PageContextImpl.handlePageException(PageContextImpl.java:845)
    org.apache.jsp.upload_jsp._jspService(upload_jsp.java:369)
    org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:432)
    org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:390)
    org.apache.jasper.servlet.JspServlet.service(JspServlet.java:334)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:727)

root cause

java.lang.OutOfMemoryError: Java heap space
    org.apache.commons.io.output.ByteArrayOutputStream.toByteArray(ByteArrayOutputStream.java:322)
    org.apache.commons.io.output.DeferredFileOutputStream.getData(DeferredFileOutputStream.java:213)
    org.apache.commons.fileupload.disk.DiskFileItem.getSize(DiskFileItem.java:289)
    org.apache.jsp.upload_jsp._jspService(upload_jsp.java:159)
    org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:432)
    org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:390)
    org.apache.jasper.servlet.JspServlet.service(JspServlet.java:334)
    javax.servlet.http.HttpServlet.service(HttpServlet.java:727)

note The full stack trace of the root cause is available in the Apache Tomcat/7.0.52 (Ubuntu) logs.

Apache Tomcat/7.0.52 (Ubuntu)
Surprisingly, there is nothing in the catalina.out file regarding this exception.
Thanks in advance.
EDIT: The code that uses DiskFileItem in upload.jsp:
// necessary imports go here
File file;
int maxFileSize = 1000 * 1000 * 1024;
int maxMemSize = 1000 * 1024;
ServletContext context = pageContext.getServletContext();
String filePath = context.getInitParameter("file-upload");
String contentType = request.getContentType();

if (contentType != null) {
    if ((contentType.indexOf("multipart/form-data") >= 0)) {
        DiskFileItemFactory factory = new DiskFileItemFactory();
        factory.setSizeThreshold(maxMemSize);
        factory.setRepository(new File("/tmp/"));
        ServletFileUpload upload = new ServletFileUpload(factory);
        upload.setSizeMax(maxFileSize);
        try {
            List fileItems = upload.parseRequest(request);
            Iterator i = fileItems.iterator();
            while (i.hasNext()) {
                FileItem fi = (FileItem) i.next();
                if (!fi.isFormField()) {
                    String fieldName = fi.getFieldName();
                    String fileName = fi.getName();
                    if (fileName.endsWith(".zip") || fileName.endsWith(".pdf") || fileName.endsWith(".doc")
                            || fileName.endsWith(".docx") || fileName.endsWith(".ppt") || fileName.endsWith(".pptx")
                            || fileName.endsWith(".html") || fileName.endsWith(".htm") || fileName.endsWith(".epub")
                            || fileName.endsWith(".djvu")) {
                        boolean isInMemory = fi.isInMemory();
                        long sizeInBytes = fi.getSize();
                        new File(filePath + fileName).mkdir();
                        filePath = filePath + fileName + "/";
                        file = new File(filePath + fileName.substring(fileName.lastIndexOf("/")));
                        fi.write(file);
                        String fileExtension = FilenameUtils.getExtension(fileName);
                        if (fileExtension.equals("zip")) {
                            System.out.println("In zip.");
                            Unzip mfe = new Unzip();
                            mfe.unzipFile(filePath + fileName, filePath);
                            File zip = new File(filePath + fileName);
                            zip.delete();
                        }
                        File corePath = new File(filePath);
                        int count = 0;
                        // some more processing
                    }
                }
            }
        } catch (Exception e) {
            // exception handling goes here
        }
    }
}
The issue is not in the unzip code you posted. The root cause is here:
java.lang.OutOfMemoryError: Java heap space
    org.apache.commons.io.output.ByteArrayOutputStream.toByteArray(ByteArrayOutputStream.java:322)
    org.apache.commons.io.output.DeferredFileOutputStream.getData(DeferredFileOutputStream.java:213)
    org.apache.commons.fileupload.disk.DiskFileItem.getSize(DiskFileItem.java:289)
Do you notice ByteArrayOutputStream.toByteArray? It seems that something is writing to a ByteArrayOutputStream which grows too much. Please locate and post the code that uses this ByteArrayOutputStream, because your zip code does not use any such thing.
UPDATE: The code you've posted seems OK, but the FileItem.getSize() call does some nasty things:
283 public long getSize() {
284     if (size >= 0) {
285         return size;
286     } else if (cachedContent != null) {
287         return cachedContent.length;
288     } else if (dfos.isInMemory()) {
289         return dfos.getData().length;
290     } else {
291         return dfos.getFile().length();
292     }
293 }
If the file item's data is stored in memory, it calls getData(), which calls toByteArray():
209 public byte[] getData()
210 {
211     if (memoryOutputStream != null)
212     {
213         return memoryOutputStream.toByteArray();
214     }
215     return null;
216 }
which in turn allocates a new array:
317 public synchronized byte[] toByteArray() {
318     int remaining = count;
319     if (remaining == 0) {
320         return EMPTY_BYTE_ARRAY;
321     }
322     byte newbuf[] = new byte[remaining];
        //do stuff
333     return newbuf;
334 }
So for a short time you have twice the normal memory consumption.
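In your upload.jsp this is most likely triggered by the long sizeInBytes = fi.getSize(); line (upload_jsp.java:159 in the trace). As a side note, and only as a sketch of mine rather than a required fix: if you only need the size after the item has been written to disk, you can drop the getSize() call and read the length from the written file, so the buffered bytes are not copied an extra time merely to report their size:

// Sketch only: avoid getSize() on a (possibly in-memory) FileItem.
// `fi` and `file` are the FileItem and target File from the upload.jsp code above.
fi.write(file);                    // FileUpload writes/moves the item's data to disk
long sizeInBytes = file.length();  // take the size from the stored file instead of fi.getSize()
fi.delete();                       // release the item's temporary storage once it is no longer needed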
I recommend to:
- Set maxMemSize to no more than 8-32 KB (see the sketch below).
- Give more memory to the JVM process: -Xmx2g, for example.
- Make sure you are not holding unnecessary references to FileItems, since in your current configuration they consume a lot of memory.
- If the OOM happens again, take a heap dump. You can use the -XX:+HeapDumpOnOutOfMemoryError JVM flag to automatically create a heap dump for you. You can then use a heap dump analyzer (for instance Eclipse MAT) to check what is allocating memory and what is being allocated.
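For instance (a sketch of mine based on the code in the question; adjust the limits to taste), the factory in upload.jsp could be configured with a small in-memory threshold, and the extra heap plus the heap-dump flag can go into the same CATALINA_OPTS variable you already use, e.g. CATALINA_OPTS="-Xmx2g -XX:+HeapDumpOnOutOfMemoryError":

// Sketch: keep only a few KB of each uploaded item in memory; larger items spill to disk.
int maxFileSize = 200 * 1024 * 1024;   // accept uploads up to ~200 MB
int maxMemSize  = 8 * 1024;            // in-memory threshold per item: 8 KB

DiskFileItemFactory factory = new DiskFileItemFactory();
factory.setSizeThreshold(maxMemSize);      // above this size, data goes to the repository below
factory.setRepository(new File("/tmp/"));  // temporary files for the large items
ServletFileUpload upload = new ServletFileUpload(factory);
upload.setSizeMax(maxFileSize);            // reject requests larger than maxFileSize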