Java NIO: Ideal approach to high-speed sequential streaming of large binary data to a file


I need to speed up a series of Java apps: the first produces a large amount of sequential binary data, which a following app then processes, also sequentially. I am looking at possibly using Java NIO, Apache Commons IO, or any other library that simplifies this. Speed is crucial. The apps are single-threaded. Files range from roughly 1 GB to 10 GB (probably much larger in the future). Writes are 2 bytes every approx. 1/10th of a second to every couple of seconds.

In essence this is very similar to logging, except the data is binary and speed is critical.

Memory-mapped files seem ideal for this, with a periodic (preferably automatic) flush. But the trivial code examples I find seem to raise more questions than answers, such as:

How exactly do ByteBuffers work? Do I create one for every write, fill a buffer and then write it out in one go, or something else?

Is Apache Commons IO's DeferredFileOutputStream a good choice? And what about flushing the buffer at the end of the run?

What other options might work better?
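To make the question concrete, here is a minimal sketch of the memory-mapped approach I have in mind. The class name, region size, and flush interval are all made up for illustration, and a real run would need to handle multi-GB files (e.g. by remapping regions), which this sketch does not:

```java
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class MappedWriterSketch {
    // Writes `count` 2-byte samples into a memory-mapped region and returns
    // the final file size. The fixed regionSize is a sketch-level assumption;
    // mapping a region extends the file to that size up front.
    static long writeSamples(Path file, int count, long regionSize) throws IOException {
        try (FileChannel ch = FileChannel.open(file,
                StandardOpenOption.CREATE, StandardOpenOption.READ,
                StandardOpenOption.WRITE)) {
            MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_WRITE, 0, regionSize);
            for (int i = 0; i < count; i++) {
                buf.putShort((short) i);          // the 2-byte write
                if ((i & 0xFF) == 0) buf.force(); // periodic flush to disk
            }
            buf.force();                          // final flush at end of run
            return ch.size();
        }
    }

    public static void main(String[] args) throws IOException {
        Path file = Files.createTempFile("stream", ".bin"); // placeholder path
        System.out.println(writeSamples(file, 1000, 1 << 20));
        Files.delete(file);
    }
}
```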
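For the ByteBuffer question, this is what I mean by "fill a buffer and then do the write": one reusable direct buffer, drained into the FileChannel only when it fills up or at the end of the run. Again, the class name and buffer size are invented for illustration:

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class BufferedChannelSketch {
    // Writes `count` 2-byte samples through one reused direct buffer and
    // returns the total number of bytes handed to the channel.
    static long writeSamples(Path file, int count, int bufCapacity) throws IOException {
        ByteBuffer buf = ByteBuffer.allocateDirect(bufCapacity);
        long total = 0;
        try (FileChannel ch = FileChannel.open(file,
                StandardOpenOption.CREATE, StandardOpenOption.WRITE,
                StandardOpenOption.TRUNCATE_EXISTING)) {
            for (int i = 0; i < count; i++) {
                if (buf.remaining() < 2) total += drain(ch, buf); // buffer full
                buf.putShort((short) i);                          // the 2-byte sample
            }
            total += drain(ch, buf);                              // flush the tail
        }
        return total;
    }

    private static long drain(FileChannel ch, ByteBuffer buf) throws IOException {
        buf.flip();                             // switch from filling to draining
        long written = 0;
        while (buf.hasRemaining()) written += ch.write(buf);
        buf.clear();                            // ready to fill again
        return written;
    }

    public static void main(String[] args) throws IOException {
        Path file = Files.createTempFile("stream", ".bin"); // placeholder path
        System.out.println(writeSamples(file, 1000, 4096)); // 1000 samples * 2 bytes
        Files.delete(file);
    }
}
```

Is this buffer-reuse pattern the right mental model, or does the mapped approach above make it unnecessary?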

A pointer to a good tutorial or a solid example would be much appreciated.