Everything posted by Composer 133

  1. Possibly this one: . Aside from the water-cooling stuff, he says: running at too high a temperature will degrade the SSD over time; running at a median temperature is optimal for both speed and longevity; and running at too low a temperature is likely to be slow. Just as you'd expect, really. Maybe we just need the $7 water-cooler and some hosepipe.
  2. For this project, SATA is a non-starter: just too slow. We have to minimise (to an affordable extent) time lost to inadequate hardware. (It would almost certainly have been better to do this work on a PC, but sadly that's not an option.) So we looked to M.2 -- and it turns out some of these drives get very hot (a review of the Samsung 970 Evo Plus reports 93 °C under extreme load). And I guess Enterprise U.2 SSDs exist for a reason, no?
  3. Thank you LAwLz, appreciated. It’s not quite like that. In short: we’re populating the tables of a database, and there are only 70 tables. The long routines are the process of populating them (a huge amount of calculation). The smallest table has only c. 20,000 records; each table gradually gets bigger, up to the 70th and largest table, which has c. 200 million. The records are very small and compact, hence the estimated size of the TOTAL DB (all 70 tables together) is ‘only’ 1.2 TB. The tricky bit (and maybe it would be better if it didn’t, but it’s due to the history of the project and it’s har…
  4. Hi Bombastinator. Thanks again for this. There must be a misunderstanding: as I say, the total DB (database, i.e. all 70 tables, the complete shebang) = 1.2 TB (estimated) -- not that large. Backup is not an issue at all. The question in a nutshell (and I should have put it like this in the first place; my fault, my brain is fuzzy with it all) is: we need really fast read and write speeds, but we also need the drive not to overheat if left running for very long periods (weeks). Therefore, do we really need to buy an expensive Enterprise U.2 drive, or could we get away with a hig…
  5. Two thoughts: 1. Are we kind of asking an impossible question? Should we just buy the fastest SSD we can find and try it? We were hoping to avoid that, because the project has already cost a lot (it's a private project related to maths and music), and at this late stage in the process we really want to avoid expensive mistakes if possible! 2. Is it possible to send the question(s) to Linus's team? Thanks
  6. Thanks ageekhere. Well, in building one table, a subset of the records is generated in RAM (up to a certain number, or time) and is then written to disk; then (all in one continuous routine) another subset is generated and written, and so on, until the table is complete. Since this Mac Pro could theoretically hold up to 768 GB RAM... maybe we should indeed revisit how this works, but we certainly can't afford quite that much RAM! Ultimately, it is necessary to save the complete table/DB file. It can't just be held in RAM, because (given our research questions) it's at that point th…
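     For what it's worth, the batch-and-flush pattern described there looks roughly like the sketch below. This is only a minimal illustration in Python with sqlite3 (the project itself uses XOJO; generate_batch is a stand-in for the real, expensive calculation, and the table and column names are invented):

         import sqlite3

         BATCH_SIZE = 1_000_000  # assumed flush threshold; tune to available RAM

         def generate_batch(start, count):
             # Placeholder for the real calculation that produces each record;
             # here just seven dummy integer columns.
             return [(i, i % 7, i * 2, i * 3, i % 13, i % 17, i % 19)
                     for i in range(start, start + count)]

         def populate(db_path, total_records):
             con = sqlite3.connect(db_path)
             con.execute("CREATE TABLE IF NOT EXISTS t (c1,c2,c3,c4,c5,c6,c7)")
             written = 0
             while written < total_records:
                 n = min(BATCH_SIZE, total_records - written)
                 batch = generate_batch(written, n)   # build one subset in RAM
                 with con:                            # one transaction per flush
                     con.executemany("INSERT INTO t VALUES (?,?,?,?,?,?,?)", batch)
                 written += n                         # then on to the next subset
             con.close()

     Writing each flushed subset inside a single transaction stops SQLite from syncing after every row, which matters enormously at these record counts.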
  7. Thanks Bombastinator. Luckily the files are not toooo big, and we have a whole bunch of external hard drives for use as backup. I also send the files to a cloud storage place in the sky. What's irritating is when it's decided we need another field, or the main programmer decides a change in file format would be more efficient... and everything has to be regenerated from scratch. The joys of research. I'm no hardware expert (trying to learn...) but am more or less aware of the other parameters you mention. Regarding 'lag' issues, can you say anything which would help…
  8. Hello Forum. Which SSD would you recommend for a maths research database application? All suggestions would be very gratefully received.
     Hardware:
     · Mac Pro 2019 / 16-core / 192 GB RAM
     Database:
     · Software: XOJO with SQLite (flat tables, non-relational) / 70 tables in all / each table has only 7 columns (fields)
     · number of records in each table varies from c. 20,000 to 200,000,000 (two hundred million)
     · the larger tables take weeks/months of non-stop processing to generate
     · largest single table will be 100 GB (estimate)
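     As a rough sanity check on those numbers: 100 GB over 200,000,000 rows works out to about 500 bytes per record on disk, i.e. the row data plus SQLite page and index overhead. A hypothetical sketch of what one such flat table might look like (again in Python with sqlite3 for illustration; the real schema and column names aren't shown in the post):

         import sqlite3

         con = sqlite3.connect("research.db")  # invented file name
         # Seven columns, no keys or joins between tables: flat and
         # non-relational, matching the shape described above.
         con.execute("""
             CREATE TABLE IF NOT EXISTS table_70 (
                 c1 INTEGER, c2 INTEGER, c3 INTEGER, c4 INTEGER,
                 c5 INTEGER, c6 INTEGER, c7 INTEGER
             )
         """)
         con.close()

     With rows that small, sustained sequential write throughput (and thermals over weeks of non-stop running) really is the binding constraint, as the thread suggests.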
  9. Thanks Nick Name. In Logic and Dorico not all the static bits are black... so I'm guessing not. I wonder how you'd find out whether burn-in is a function of brightness?
  10. Thanks for the reply, Dizmo. No, I'm not gaming, just looking for the highest quality I can get, and any advantage that's kind to the eyes. Do you think there is zero advantage to be gained from, say, the CG437? Various reviews seem to suggest it's a very good all-rounder. The Dell didn't get such good reviews. One thing might be useful on the Dell, though -- simultaneous inputs; I can imagine a use for running my Mac laptop into it as well as the Mac Pro. When I contacted Acer about connecting the CG437 to the Mac Pro, I got a reply saying I'd need a 3rd-party adapter, but they didn't sa…
  11. Hello Forum, New member here. I need to buy a monitor: 43” (or a little bigger) and 16:9. I’m a composer writing for orchestra, and I want to see the whole score, from Flutes to Double Basses, without scrolling. The computer is a Mac Pro 2019 with an AMD Radeon Pro W5700X GPU. I use Apple Logic and Steinberg Dorico. Everything else is secondary. I'm in the UK. I want a monitor, not a TV. However, I’m 63, and I have cataracts and other eye problems which (for various reasons) may never be fixed. I want to take care of my eyes as long as possible… So I’m looking…