hi
I hope you can help me figure out whether the idea I have in mind has a routine, off-the-shelf solution, or whether I have to build all of it myself:
I want to work with 50 GB of unique data (100 series of 250,000 numbers each), derive at least 200 new series from each of those 100 series (20,000 series in total, roughly 10,000 GB), and then
create and train an ANN on it, probably with one of the Python machine learning frameworks (PyTorch, TensorFlow, ...).
How can I do this with no more than about 200 GB of storage? (That figure is approximate; the point is that it is far less than 10,000 GB.)
Is that possible with the standard tooling? Or should I build the ANN pipeline from scratch and arrange it so that each sample evaluation (1 of 250,000 per epoch) reads just 100 numbers (one from each of the 100 series), generates the other 19,900 numbers on the fly, repeats that for all 250,000 samples to finish one epoch, and then runs further epochs until training is done?
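To make that second idea concrete, here is a rough PyTorch sketch of what I imagine. `derive_features` is just a dummy stand-in for my real derivation rules, and the file layout (a flat float32 array of shape 250,000 x 100 called `base_data.f32`) is an assumption I made for the example:

```python
import numpy as np
import torch
from torch.utils.data import Dataset, DataLoader

N_SAMPLES, N_BASE = 250_000, 100  # 250,000 steps, 100 base series

def derive_features(base_row):
    """Placeholder for the real derivation rules: turns the 100 base
    values at one step into 20,000 values (the 100 originals plus
    19,900 newly generated ones)."""
    return np.concatenate([base_row * k for k in range(1, 201)])

class OnTheFlySeriesDataset(Dataset):
    def __init__(self, path):
        self.path = path
        self.base = None  # opened lazily so DataLoader workers don't copy 50 GB

    def __len__(self):
        return N_SAMPLES

    def __getitem__(self, idx):
        if self.base is None:
            # Memory-map the base file: only the rows actually indexed
            # are read from disk, never the whole 50 GB at once.
            self.base = np.memmap(self.path, dtype=np.float32, mode="r",
                                  shape=(N_SAMPLES, N_BASE))
        row = np.asarray(self.base[idx])   # the 100 base numbers
        sample = derive_features(row)      # 20,000 numbers, never written to disk
        return torch.from_numpy(sample.astype(np.float32))

loader = DataLoader(OnTheFlySeriesDataset("base_data.f32"),
                    batch_size=64, shuffle=True, num_workers=4)

for batch in loader:  # one full pass over the 250,000 indices = one epoch
    pass              # batch has shape (64, 20000); feed it to the model here
```

This way only one batch of derived numbers ever exists in memory, and nothing beyond the original 50 GB ever touches the disk. Is something like this the standard way to do it, or is there a more routine solution I am missing?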