
# Processing a large file in .NET
## The Problem

I need to be able to save and read a very big data structure using C#. The structure itself is rather simple; it's a very long array of simple structs of a constant size.

Just an example for clarity:

```csharp
struct st
{
    UInt32 a;
    UInt16 b;
    // etc.
}

st[] data = new st[1024 * 1024 * 100];
```

I want to be able to save and load these to files as fast and as efficiently as possible.

## General Direction

My idea so far is to cut the data into segments, conceptually of course, assign those segments to tasks, and just write them into the file asynchronously. FileStream.WriteAsync appears to be perfect for this.

My problem is with the reading. From the FileStream.ReadAsync API it seems entirely possible that the results can be cut in the middle of a structure, halfway across a primitive in fact. Of course I can work around this, but I'm not sure what the best way would be, and how much I would interfere with the OS's buffering mechanism.

Eventually I plan to create a MemoryStream from each buffer with `MemoryStream.MemoryStream(byte[])` and read each one into the structs with a BinaryReader. (Sketches of the write, read, and decode steps follow the conclusions below.)

## The Question

So what would be the best way to solve this? Is my direction good? Are there any better solutions? Code examples and links would be appreciated...

## Conclusions

After doing performance testing, I found that reading the file with a single BinaryReader, or using multiple readers with FileStream.ReadAsync, gives approximately the same performance.

Soo.... the question is pointless.
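To make the write path concrete, here is a minimal sketch of the segmented-write idea, assuming the array has already been serialized into a `byte[]` (the method name, buffer size, and segment size are illustrative, not from the question). It issues the WriteAsync calls sequentially and awaits each one; handing segments to genuinely parallel tasks would need one file handle per task and is left out here.

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

static class SegmentedWriter
{
    // Writes 'data' to 'path' in fixed-size segments, one WriteAsync call per segment.
    // useAsync: true asks the OS for overlapped (asynchronous) I/O.
    public static async Task WriteSegmentsAsync(string path, byte[] data, int segmentSize)
    {
        using (var fs = new FileStream(path, FileMode.Create, FileAccess.Write,
                                       FileShare.None, bufferSize: 64 * 1024, useAsync: true))
        {
            for (int offset = 0; offset < data.Length; offset += segmentSize)
            {
                int count = Math.Min(segmentSize, data.Length - offset);
                await fs.WriteAsync(data, offset, count);
            }
        }
    }
}
```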
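For the reading side, one way to avoid buffers that end mid-struct is to request reads whose length is a multiple of the struct size and to loop until the buffer is full, since ReadAsync is allowed to return fewer bytes than requested. A sketch; the helper name `FillAsync` is my own:

```csharp
using System.IO;
using System.Threading.Tasks;

static class ReadHelpers
{
    // Repeatedly calls ReadAsync until 'buffer' is full or the stream ends.
    // If buffer.Length is a multiple of the struct size, a filled buffer
    // always holds whole structs, never one cut across a boundary.
    public static async Task<int> FillAsync(FileStream fs, byte[] buffer)
    {
        int total = 0;
        while (total < buffer.Length)
        {
            int read = await fs.ReadAsync(buffer, total, buffer.Length - total);
            if (read == 0) break; // end of file
            total += read;
        }
        return total;
    }
}
```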
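Finally, a sketch of the MemoryStream-plus-BinaryReader decode step described above. The field and type names follow the example struct; the 6-byte on-disk size is an assumption and holds only if the structs were written field by field without padding:

```csharp
using System.IO;

struct St
{
    public uint A;
    public ushort B;
}

static class StructDecoder
{
    const int StructSize = 6; // sizeof(uint) + sizeof(ushort), assuming no padding on disk

    // Wraps a filled buffer in a MemoryStream and decodes whole structs with BinaryReader.
    public static St[] Decode(byte[] buffer, int byteCount)
    {
        var result = new St[byteCount / StructSize];
        using (var br = new BinaryReader(new MemoryStream(buffer, 0, byteCount)))
        {
            for (int i = 0; i < result.Length; i++)
            {
                result[i].A = br.ReadUInt32();
                result[i].B = br.ReadUInt16();
            }
        }
        return result;
    }
}
```

Used together: fill a buffer whose length is a multiple of StructSize with FillAsync, then pass the buffer and the returned byte count to Decode, so every decoded struct is complete.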
 
