
Custom HashMap Code Issue
<p>I have the following code, in which I use a hash map built on two parallel arrays to store key-value pairs (a key can have multiple values). I need to persist it for future use, so I store and load it through a FileChannel. The issue with this code: I can store roughly 120 million key-value pairs on my 8 GB server (I can allocate about 5 GB of the 8 GB to my JVM; the two parallel arrays take about 2.5 GB, and the rest is used by other processing in my code). However, I need to store roughly 600-700 million key-value pairs. Can anybody help me modify this code so it can hold 600-700 million pairs? Any comment on this would be helpful. Another point: loading and storing the hash map to/from disk takes quite a long time with a FileChannel. Following various Stack Overflow suggestions, I have not found a faster approach; I also tried ObjectOutputStream and a zipped output stream, but both were slower than the code below. Is there any way to store those two parallel arrays so that loading is much faster? I have included a test case in the code below; any comment on that would also be helpful.
</p> <pre><code>import java.io.RandomAccessFile;
import java.nio.ByteOrder;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.util.Arrays;
import java.util.Random;

public class Test {
    public static void main(String[] args) {
        try {
            Random randomGenerator = new Random();
            LongIntParallelHashMultimap lph =
                    new LongIntParallelHashMultimap(220000000, "xx.dat", "yy.dat");
            for (int i = 0; i &lt; 110000000; i++) {
                lph.put(i, randomGenerator.nextInt(200000000));
            }
            lph.save();
            LongIntParallelHashMultimap lphN =
                    new LongIntParallelHashMultimap(220000000, "xx.dat", "yy.dat");
            lphN.load();
            int[] tt = lphN.get(1);
            System.out.println(tt[0]);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

class LongIntParallelHashMultimap {
    private static final long NULL = -1L;
    private final long[] keys;
    private final int[] values;
    private int size;
    private int savenum;
    private String str1;
    private String str2;

    public LongIntParallelHashMultimap(int capacity, String st1, String st2) {
        keys = new long[capacity];
        values = new int[capacity];
        Arrays.fill(keys, NULL);
        savenum = capacity;
        str1 = st1;
        str2 = st2;
    }

    // Open addressing with linear probing; a key may occur at several slots.
    public void put(long key, int value) {
        int index = indexFor(key);
        while (keys[index] != NULL) {
            index = successor(index);
        }
        keys[index] = key;
        values[index] = value;
        ++size;
    }

    public int[] get(long key) {
        int index = indexFor(key);
        int count = countHits(key, index);
        int[] hits = new int[count];
        int hitIndex = 0;
        while (keys[index] != NULL) {
            if (keys[index] == key) {
                hits[hitIndex] = values[index];
                ++hitIndex;
            }
            index = successor(index);
        }
        return hits;
    }

    private int countHits(long key, int index) {
        int numHits = 0;
        while (keys[index] != NULL) {
            if (keys[index] == key) {
                ++numHits;
            }
            index = successor(index);
        }
        return numHits;
    }

    private int indexFor(long key) {
        return Math.abs((int) ((key * 5700357409661598721L) % keys.length));
    }

    private int successor(int index) {
        return (index + 1) % keys.length;
    }

    public int size() {
        return size;
    }

    public void load() {
        try {
            FileChannel channel2 = new RandomAccessFile(str1, "r").getChannel();
            MappedByteBuffer mbb2 = channel2.map(FileChannel.MapMode.READ_ONLY, 0, channel2.size());
            mbb2.order(ByteOrder.nativeOrder());
            assert mbb2.remaining() == savenum * 8;
            for (int i = 0; i &lt; savenum; i++) {
                keys[i] = mbb2.getLong();
            }
            channel2.close();

            FileChannel channel3 = new RandomAccessFile(str2, "r").getChannel();
            MappedByteBuffer mbb3 = channel3.map(FileChannel.MapMode.READ_ONLY, 0, channel3.size());
            mbb3.order(ByteOrder.nativeOrder());
            assert mbb3.remaining() == savenum * 4;
            for (int i = 0; i &lt; savenum; i++) {
                values[i] = mbb3.getInt();
            }
            channel3.close();
        } catch (Exception e) {
            System.out.println(e);
        }
    }

    public void save() {
        try {
            FileChannel channel = new RandomAccessFile(str1, "rw").getChannel();
            MappedByteBuffer mbb = channel.map(FileChannel.MapMode.READ_WRITE, 0, savenum * 8);
            mbb.order(ByteOrder.nativeOrder());
            for (int i = 0; i &lt; savenum; i++) {
                mbb.putLong(keys[i]);
            }
            channel.close();

            FileChannel channel1 = new RandomAccessFile(str2, "rw").getChannel();
            MappedByteBuffer mbb1 = channel1.map(FileChannel.MapMode.READ_WRITE, 0, savenum * 4);
            mbb1.order(ByteOrder.nativeOrder());
            for (int i = 0; i &lt; savenum; i++) {
                mbb1.putInt(values[i]);
            }
            channel1.close();
        } catch (Exception e) {
            System.out.println("IOException : " + e);
        }
    }
}
</code></pre>
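For context, the heap ceiling described in the question can be checked with simple arithmetic: each slot costs 8 bytes in the `long[]` plus 4 bytes in the `int[]`. The sketch below uses the capacity from the question (220 million slots for ~110 million pairs, i.e. ~50% load factor) and assumes the same load factor for the 700-million-pair target; the class and numbers are illustrative, not part of the original code.

```java
// Rough heap-sizing sketch. Assumption: plain Java arrays; JVM array headers
// (a few dozen bytes total) are negligible at this scale.
public class FootprintSketch {
    // Bytes consumed by the two parallel arrays at a given capacity:
    // 8 bytes per key (long) + 4 bytes per value (int).
    static long footprintBytes(long capacity) {
        return capacity * (Long.BYTES + Integer.BYTES);
    }

    public static void main(String[] args) {
        long current = 220_000_000L;   // capacity used in the question (~110M pairs)
        long target  = 1_400_000_000L; // ~700M pairs at the same ~50% load factor
        System.out.printf("current: %.2f GB%n", footprintBytes(current) / 1e9);
        System.out.printf("target : %.2f GB%n", footprintBytes(target) / 1e9);
    }
}
```

At 12 bytes per slot, the current capacity needs about 2.64 GB, matching the ~2.5 GB observed, while the 700-million-pair target needs roughly 16.8 GB of arrays alone, which is why it cannot fit in a 5 GB heap without narrower slot encodings or off-heap/on-disk storage.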