So I have over 30 million objects that I need to use as my training data. My issue is simple: when I build my training array by iteratively appending, at a certain threshold the list becomes too large and the Python process gets killed. What is a way to get around this? I have been trying to figure this out for hours and keep coming up short!
Code example for creating the training array:

```python
training_array = []
for ...:
    data = ...  # load data from somewhere
    data_array = [x for x in data]  # some large array, 2-3 million objects
    for item in data_array:
        training_array.append(item.a + item.b)
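One common way around this, assuming `item.a + item.b` produces a numeric value, is to stream the results into a preallocated NumPy array instead of growing a Python list: a list of 30M Python floats costs several gigabytes of object and pointer overhead, while a flat `float64` array holds the same values in about 240 MB. The `Item` class and `load_chunks` generator below are hypothetical stand-ins for the real objects and data source; this is a minimal sketch, not the asker's actual pipeline.

```python
import numpy as np

class Item:
    # Hypothetical stand-in for one loaded object with numeric fields.
    def __init__(self, a, b):
        self.a = a
        self.b = b

def load_chunks():
    # Placeholder for the real data source; yields chunks of items
    # (the real chunks would be 2-3 million objects each).
    for _ in range(3):
        yield [Item(i, i * 2) for i in range(1000)]

# Preallocate one flat array up front instead of appending to a list.
# Each value is stored as a raw 8-byte float64, not a ~28-byte Python
# float object plus an 8-byte list pointer.
total = 3 * 1000  # known total count of training values
training_array = np.empty(total, dtype=np.float64)

pos = 0
for chunk in load_chunks():
    for item in chunk:
        training_array[pos] = item.a + item.b
        pos += 1
```

If even the preallocated array will not fit in RAM, `np.memmap` with the same shape and dtype keeps the array on disk and pages it in on demand, so the fill loop above works unchanged.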