farhang amaji

Member
  • Posts: 11
  • Joined
  • Last visited

Awards

This user doesn't have any awards

farhang amaji's Achievements

  1. No, I don't, and as I said, right after installing it everything is fine, but after a restart it goes back to reserving the memory again. I fixed it once, then two days later the same thing happened.
  2. Hi. I bought a new RAM stick and added it to a memory slot (the new stick is a different brand than the one already installed, but both are 8 GB). I have taken it in for repair twice (for the simple task of seating the sticks in their slots), and every time it gets fixed: there, Task Manager shows my installed memory correctly as 16 GB. But every time I get home, installed memory is back to 8 GB, with 8 GB shown as hardware reserved. On boot the motherboard buzzes and the RAM LED turns on, then the buzzing stops, the light goes off, and it boots with 8 GB of RAM. What's the problem?
  3. First, I'm not sure about the size of 100 sets of 250,000 numbers, and the 50 GB was only an example in this whole text. Second, 10,000 GB is not 1000 TB; it's 10 TB (a back-of-the-envelope size check is sketched after this list).
  4. Hi. I hope you can help me and tell me whether the thing I have in mind has a routine solution, or whether I have to do all of it myself. I want to work with 50 GB of unique data (100 series of 250,000 numbers each), and I want to derive at least 200 new series from each of those 100 series (20,000 series, roughly 10,000 GB), then create and train an ANN on it, probably with a Python machine-learning framework (PyTorch, TensorFlow, ...). How can I do that with no more than about 200 GB of storage (an approximation; I mean far less than 10,000 GB)? Is it possible? Or should I build the ANN from scratch and arrange it so that each data evaluation (1 of the 250,000 evaluations per epoch) just reads 100 numbers (one from each of the 100 series), generates the other 19,900 numbers on the fly, repeats that about 250,000 times to finish one epoch, and then runs further epochs until training is finished (something like the on-the-fly dataset sketched after this list)?
  5. It may need to write the results of the steps to an output file, so it probably won't touch the hard drive only at the start. The main question is that I can't tell whether it will keep writing step results to an output file and thereby make the HDD the bottleneck. I don't understand how to figure that out. Just by monitoring Task Manager, or do you know another way to tell whether it writes out the step results (one way to check is sketched after this list)?
  6. Will it give an error for low RAM??? I thought RAM overflow is stored temporarily on the Windows drive (the page file) anyway, so no error should ever occur in any software just because of low RAM, except for disproportionately large overflow. And why did it hit a limit error (if I'm reading the picture right) at just 2 GB of RAM used by MATLAB on a 32 GB system??? Also, I didn't say HDD capacity; I meant the HDD's write and read speed. And I didn't understand how you concluded from the pic that this has nothing to do with the HDD's read/write speed.
  7. First, why do you think it doesn't care about the speed limit of the hard drives? Second, it's hard for me to uninstall Abaqus from the SSD and reinstall it on the HDD just for the sake of taking a benchmark (though a raw drive-speed check, sketched after this list, wouldn't need a reinstall).
  8. Hi. My question is: for computational software like Abaqus or MATLAB running huge models on a machine with lots of RAM installed (more than ever gets filled), will an HDD or a non-Optane SSD still be the speed bottleneck compared to Optane? Or, to put it another way, will lots of RAM remove the need to store data on the hard drive during the run (except at the end, of course)? Any specific information about Abaqus or MATLAB would be appreciated.
  9. But anyway, on the same task, would it give you a faster answer for CPU computations like Abaqus?
  10. First, thanks to everyone who answered the question, but the main question is: is having the drive with the fastest speed possible really the best approach? How fast can it actually push data?
  11. What is HPE LOGICAL VOLUME on https://www.harddrivebenchmark.net/high_end_drives.html ?
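
For item 3 above, a quick size check. The counts come from the posts; the 8 bytes per number (64-bit float) is my assumption:

```python
# Back-of-the-envelope size check, assuming 64-bit floats (8 bytes each).
series = 100
numbers_per_series = 250_000
bytes_per_number = 8  # assumed float64

raw = series * numbers_per_series * bytes_per_number
print(raw / 1e6, "MB")        # 200.0 MB for the original 100 series

derived = raw * 200           # ~200 new series derived from each original
print(derived / 1e9, "GB")    # 40.0 GB for all 20,000 derived series

# And the unit conversion itself:
print(10_000 / 1_000, "TB")   # 10,000 GB = 10.0 TB, not 1000 TB
```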
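
For item 4, a minimal sketch of the on-the-fly idea in PyTorch, so nothing derived is ever written to disk. `derive_series` is a hypothetical placeholder for whatever rule actually turns the originals into the derived numbers; its contents are pure assumption:

```python
import torch
from torch.utils.data import Dataset, DataLoader

def derive_series(x: torch.Tensor, n_derived: int = 200) -> torch.Tensor:
    # Placeholder transform; the real rule depends on the actual data.
    scales = torch.linspace(0.5, 1.5, n_derived)
    return x.unsqueeze(-1) * scales  # shape: (..., n_derived)

class OnTheFlyDataset(Dataset):
    def __init__(self, originals: torch.Tensor, n_derived: int = 200):
        # originals: (250_000, 100) -- one number from each of the 100 series
        self.originals = originals
        self.n_derived = n_derived

    def __len__(self):
        return self.originals.shape[0]

    def __getitem__(self, idx):
        row = self.originals[idx]                     # 100 stored numbers
        derived = derive_series(row, self.n_derived)  # 100 x 200 = 20,000
        return derived.flatten()

originals = torch.rand(250_000, 100)  # stand-in for the real original data
loader = DataLoader(OnTheFlyDataset(originals), batch_size=64, shuffle=True)
for batch in loader:
    ...  # train the ANN on each generated batch; nothing extra hits the disk
```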
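
For item 5, besides watching Task Manager, the psutil library can sample a process's write counters to show whether it keeps writing during a run. A rough sketch; the PID and sampling interval are placeholders:

```python
import time
import psutil

# Sample how many bytes a running process (e.g. the Abaqus or MATLAB solver)
# writes over time, to see whether it keeps writing step results to disk.
pid = 12345  # placeholder: look up the real PID in Task Manager
proc = psutil.Process(pid)

last = proc.io_counters().write_bytes
while proc.is_running():
    time.sleep(5)  # sampling interval (assumed)
    now = proc.io_counters().write_bytes
    print(f"wrote {(now - last) / 1e6:.1f} MB in the last 5 s")
    last = now
```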
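
For item 7, a crude sequential-write throughput check of a drive doesn't require moving Abaqus at all. The path and test size below are assumptions, and this only measures sequential writes, not the solver's real access pattern:

```python
import os
import time

# Crude sequential-write throughput test. Point `path` at the drive to test.
path = r"D:\bench.tmp"           # placeholder path on the drive under test
size_mb = 1024                   # assumed test size: 1 GB
chunk = os.urandom(1024 * 1024)  # 1 MB buffer

t0 = time.perf_counter()
with open(path, "wb") as f:
    for _ in range(size_mb):
        f.write(chunk)
    f.flush()
    os.fsync(f.fileno())         # make sure data actually hit the drive
elapsed = time.perf_counter() - t0
print(f"sequential write: {size_mb / elapsed:.0f} MB/s")

os.remove(path)
```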