
How good is a GTX 1050 Ti at running code?

CyanideDaydreams

47 minutes ago, CyanideDaydreams said:

How good is a GTX 1050 Ti at running code 24/7?

PyCUDA, etc.

It wasn't designed to run 24/7, but it should work just fine. The fan(s) are more likely to fail, but they're not expensive to replace.
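For reference, a minimal PyCUDA device query to confirm the card shows up and how much VRAM is free (just a sketch, assuming PyCUDA and the CUDA toolkit are installed):

```python
# Minimal device query with PyCUDA.
import pycuda.driver as cuda
import pycuda.autoinit  # creates a context on the default device

dev = cuda.Device(0)
print("Device:", dev.name())                         # e.g. "GeForce GTX 1050 Ti"
print("Compute capability:", dev.compute_capability())
free, total = cuda.mem_get_info()                    # bytes of free/total VRAM
print(f"VRAM: {free / 2**20:.0f} MiB free of {total / 2**20:.0f} MiB")
```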

ಠ_ಠ


23 minutes ago, CyanideDaydreams said:

Python, JavaScript, C++

Those are languages, not code. What your code does is what matters here. It could range from being fine, to being slow, to being unnecessary, depending on what your program uses the GPU for.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


49 minutes ago, Sauron said:

Those are languages, not code. What your code does is what matters here. It could range from being fine, to being slow, to being unnecessary, depending on what your program uses the GPU for.

It's an AI that creates photos from photos.


7 minutes ago, CyanideDaydreams said:

It's an AI that creates photos from photos.

If you're using CUDA acceleration then you can benefit from the 1050 Ti; however, it won't be very fast. Whether it's good enough will depend on your expectations...
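As a quick sanity check that the GPU is actually being used (a sketch that assumes the model runs on PyTorch; adjust for whatever framework the tool actually uses):

```python
# Confirms CUDA sees the 1050 Ti and that tensors land on it.
# Assumes PyTorch with CUDA support is installed.
import torch

print(torch.cuda.is_available())        # should print True
print(torch.cuda.get_device_name(0))    # e.g. "NVIDIA GeForce GTX 1050 Ti"

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
x = torch.randn(1024, 1024, device=device)  # small test tensor on the GPU
print((x @ x).device)                       # should print cuda:0
```

If `is_available()` comes back False, the job is silently running on the CPU and the 1050 Ti isn't helping at all.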

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


I had CUDA running 24/7 on a 1060 for about a year and a half without any issue. The card was far from running anywhere near 100%, but it was stable. The only time it really turned off was during a power outage.
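If it's going to run around the clock, it's worth keeping an eye on temperatures and utilization. A simple polling sketch (assuming nvidia-smi is on the PATH, which it is with a normal NVIDIA driver install):

```python
import subprocess
import time

# Log GPU temperature, utilization and memory use once a minute.
while True:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=temperature.gpu,utilization.gpu,memory.used",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    temp, util, mem = [v.strip() for v in out.split(",")]
    print(f"temp={temp}C util={util}% mem={mem}MiB")
    time.sleep(60)
```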


I'm assuming you're just starting out. You might have a better experience training your model on a cloud ML service, like Cloud TPU. It's probably much more efficient at the same work than your GPU (jobs that would take the 1050 Ti weeks could likely finish in hours). Not sure if that's helpful, but I don't want you to be painted into a corner when you eventually upgrade, only to find an incremental difference in ML performance.

