Issue
I'm getting a MemoryError when trying to split a very large string.
data = load_data(file_name) # loads data string from file
splited_data = data.split('\n\n')
Why does this happen, and how can it be fixed? I'm working with Python 2.7.
Solution
The function load_data is reading the entire file into memory, and it is clear you don't have enough memory to do that. So you will have to abandon the idea of having a read phase followed by a processing phase. Instead, read your file a line at a time, and process the lines as you get them.
This processes the file one non-empty line at a time, so the whole file is never held in memory. Note that it is not exactly equivalent to data.split('\n\n'): that call produces blank-line-separated blocks, which may each span several lines, whereas this loop handles individual lines:
with open("mybigfile.txt", "r") as f:
    for line in f:
        mydata = line.rstrip()
        if mydata:
            do_something_with(mydata)
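If you do need the blank-line-separated blocks that data.split('\n\n') would have produced, a small generator can reassemble them on the fly while still reading one line at a time. This is a rough sketch, not the original answer's code: the name iter_paragraphs is mine, and runs of consecutive blank lines are collapsed into a single separator, which differs slightly from split('\n\n').

```python
def iter_paragraphs(path):
    # Yield blank-line-separated blocks of text one at a time,
    # so the whole file is never held in memory at once.
    buf = []
    with open(path, "r") as f:
        for line in f:
            line = line.rstrip("\n")
            if line == "":
                if buf:                  # blank line ends the current block
                    yield "\n".join(buf)
                    buf = []
            else:
                buf.append(line)
    if buf:                              # flush the final block
        yield "\n".join(buf)
```

Each yielded block can then be passed to do_something_with() just as one element of the original split result would have been.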
Answered By - BoarGules