Too much memory while reading a dictionary file in Java


I read a dictionary file that might be 100 MB in size (sometimes it gets bigger, up to 500 MB max). It is a simple dictionary of two columns: the first column is words, the second column is float values. I read the dictionary file this way:

    BufferedReader br = new BufferedReader(new FileReader(file));
    String line;
    while ((line = br.readLine()) != null) {
        String[] cols = line.split("\t");
        setit(cols[0], cols[1]);
    }

And the setit function:

    public void setit(String term, String value) {
        all.put(term, new Double(value));
    }

When I have a big file, it takes a long time to load and then it runs out of memory. Even for a reasonably sized file (100 MB), I need 4 GB of memory in Java to run it.

Any clue how to improve this without changing the structure of the whole package?

EDIT: I'm using a 50 MB file with -Xmx1g and still get the error.

UPDATE: There were extra iterations over the file; I fixed them and the memory problem is partially solved. I will still try properties and the other solutions and report back on that.

You are allocating a new String for every line. There is overhead associated with each String; see here for a calculation. This article addresses the subject of object memory use in Java.
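One way to reduce the per-line garbage is to avoid String.split, which allocates a String[] (plus regex machinery) for every line, and to parse the value directly with Double.parseDouble instead of constructing a Double from a String. A minimal sketch, assuming the tab-separated two-column format from the question; the class and method names (DictLoader, load, get) are illustrative, not from the original code:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

public class DictLoader {
    // Stand-in for the original "all" field.
    private final Map<String, Double> all = new HashMap<>();

    public void load(String path) throws IOException {
        try (BufferedReader br = new BufferedReader(new FileReader(path))) {
            String line;
            while ((line = br.readLine()) != null) {
                // indexOf + substring: no String[] or regex allocation per line.
                int tab = line.indexOf('\t');
                if (tab < 0) continue; // skip malformed lines
                String term = line.substring(0, tab);
                // parseDouble yields a primitive; it is boxed exactly once on put.
                double value = Double.parseDouble(line.substring(tab + 1));
                all.put(term, value);
            }
        }
    }

    public Double get(String term) {
        return all.get(term);
    }
}
```

This keeps the same overall structure (one map, one pass over the file) while cutting the short-lived allocations that drive GC pressure during loading.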

There is also a Stack Overflow question here on the subject of more memory-efficient replacements for strings.

Is there anything you can do to avoid those allocations? For example, if there is only a limited number of strings that represent an integer in your data structure, could you use a smaller lookup table to translate them?
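The lookup-table idea can be sketched as a canonicalizing cache: each distinct value string is parsed once, and every later occurrence reuses the same boxed Double instead of allocating a new one. This only pays off when value strings repeat often; the DoubleCache name is illustrative:

```java
import java.util.HashMap;
import java.util.Map;

public class DoubleCache {
    // Maps each distinct raw string to a single shared Double instance.
    private final Map<String, Double> cache = new HashMap<>();

    public Double canonical(String raw) {
        // computeIfAbsent parses on first sight, reuses the object afterwards.
        return cache.computeIfAbsent(raw, Double::valueOf);
    }
}
```

In the loader above you would then write `all.put(term, doubleCache.canonical(cols[1]))`, trading one small extra map for far fewer boxed Double objects when the file contains many repeated values.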
