.NET Need help opening large files (VB)

  • Thread starter DarkGamer NZ

DarkGamer NZ

Newbie
Messages
14
Reaction score
1
OK, so currently I have this code:
Code:
Dim Input As New System.IO.FileStream(FileDir.Text, FileMode.Open)
Dim FileEncrypted As Long = 0
Dim FileSize As Long = Input.Length
Dim bytes(CInt(Input.Length - 1)) As Byte
Input.Read(bytes, 0, CInt(Input.Length))
Input.Flush()
Input.Close()
Input.Dispose()

and this works fine for files up to 100MB, but I want to be able to use it for files that could be 100GB+.
I have spent a lot of time optimizing the code to process files as fast as possible, but I have hit a wall when it comes to large files: I get a "System.OutOfMemoryException". I have seen people use a StreamReader to read one line at a time, but that doesn't work in my case, as I am working with the raw bytes and not just text.

The line that throws the exception is:
Code:
Dim bytes(CInt(Input.Length - 1)) As Byte

I have thought about splitting the file into 10MB chunks, but I have no idea how to do that since I cannot open the file in the first place. Any help is appreciated.
 

AnonLobbies

The One and Only
Messages
297
Reaction score
134
You are trying to read the entire file into memory at once, which is not a good plan for such large files. You are not going to have enough memory in your process to do this, and even if you did, it would be horribly slow. You're going to have to think about what you are trying to do and come up with a plan to look at the file in smaller chunks. As you say, if you looked at it one line at a time you would only need enough memory for each line. Even 10MB is going to be a lot to load at once and will likely slow you down. For example, there are a lot of code editors out there and very few that can handle massive files; the ones that do load only small sections of the file at a time.
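The chunked approach described above can be sketched roughly like this; `ProcessChunk` is a hypothetical placeholder for whatever per-chunk work (encryption, in your case) needs to happen:
Code:
' A minimal sketch of reading a file in fixed-size chunks rather than
' all at once. Memory use stays at one buffer regardless of file size.
Imports System.IO

Module ChunkReader
    Const ChunkSize As Integer = 10 * 1024 * 1024 ' 10 MB per read

    Sub ReadInChunks(path As String)
        Using input As New FileStream(path, FileMode.Open, FileAccess.Read)
            Dim buffer(ChunkSize - 1) As Byte
            Do
                ' Read may return fewer bytes than requested, so always
                ' use its return value rather than the buffer length.
                Dim bytesRead As Integer = input.Read(buffer, 0, buffer.Length)
                If bytesRead = 0 Then Exit Do ' end of file
                ProcessChunk(buffer, bytesRead)
            Loop
        End Using ' Using disposes the stream, so no explicit Close/Dispose
    End Sub

    Sub ProcessChunk(data As Byte(), count As Integer)
        ' Hypothetical placeholder: only the first count bytes are valid.
    End Sub
End Module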
 

DarkGamer NZ

Newbie
Messages
14
Reaction score
1
You are trying to read the entire file into memory at once, which is not a good plan for such large files. You are not going to have enough memory in your process to do this, and even if you did, it would be horribly slow. You're going to have to think about what you are trying to do and come up with a plan to look at the file in smaller chunks. As you say, if you looked at it one line at a time you would only need enough memory for each line. Even 10MB is going to be a lot to load at once and will likely slow you down. For example, there are a lot of code editors out there and very few that can handle massive files; the ones that do load only small sections of the file at a time.
I have started splitting files into 10MB or 20MB chunks to be processed and then reassembled. Doing this isn't a slow task, as everything happens out of sight and is optimized. The one other thing I have to do is multicore processing: right now I am doing about 2.5MB/s, but with 4 threads this would be almost 10MB/s.
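One rough way to sketch the multicore idea is to read a batch of chunks, process them on worker threads with Task.Run, and write the results back in order before reading the next batch, so memory stays bounded even for 100GB files. EncryptChunk here is a hypothetical stand-in for the real per-chunk work:
Code:
' A sketch under assumptions: batch-parallel chunk processing with the
' Task Parallel Library; EncryptChunk is a hypothetical placeholder.
Imports System.IO
Imports System.Threading.Tasks

Module ParallelChunks
    Const ChunkSize As Integer = 10 * 1024 * 1024 ' 10 MB

    Sub ProcessFile(inputPath As String, outputPath As String)
        Dim batchSize As Integer = Environment.ProcessorCount
        Using input As New FileStream(inputPath, FileMode.Open, FileAccess.Read),
              output As New FileStream(outputPath, FileMode.Create, FileAccess.Write)
            Dim done As Boolean = False
            Do Until done
                ' Read one batch of chunks, dispatching each to a worker task.
                Dim batch As New List(Of Task(Of Byte()))()
                For i As Integer = 1 To batchSize
                    Dim buffer(ChunkSize - 1) As Byte
                    Dim read As Integer = input.Read(buffer, 0, buffer.Length)
                    If read = 0 Then done = True : Exit For
                    ' Fresh locals per iteration so each lambda captures its own chunk.
                    Dim chunk As Byte() = buffer
                    Dim count As Integer = read
                    batch.Add(Task.Run(Function() EncryptChunk(chunk, count)))
                Next
                ' Writing in task order preserves the original chunk order.
                For Each t In batch
                    Dim result As Byte() = t.Result ' blocks until this chunk is done
                    output.Write(result, 0, result.Length)
                Next
            Loop
        End Using
    End Sub

    Function EncryptChunk(data As Byte(), count As Integer) As Byte()
        ' Hypothetical placeholder: process and return the first count bytes.
        Dim result(count - 1) As Byte
        Array.Copy(data, result, count)
        Return result
    End Function
End Module

Only one batch of chunks is in memory at a time, so memory stays at roughly batchSize × ChunkSize no matter how big the file is.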
 