jaysangwan32

NVIDIA Titan V Benchmarks

Recommended Posts

On 12/12/2017 at 9:05 PM, Sierra Fox said:

tell your friend "Sierra Fox from the LinusTechTips forum says you're an idiot"

 

On 12/12/2017 at 9:20 PM, Not_Sean said:

 

Tell him Not_Sean agrees with Sierra Fox 

 

(I don't know what's happening, but I'm not missing a chance to call someone an idiot :P ) 

 

2 hours ago, VegetableStu said:

so shallow learning then

Eh, I don't care. When he gets it, I'll be SLI-ing my 1080 Ti FTW3 for only 400 bucks. Funny thing is, the Titan V is only 8-18% faster than a stock 1080 Ti, so SLI-ing my FTW3 1080 Ti would be far better for gaming, which is what he'll be using it for. His loss, my gain.


CPU: 6700K Case: Corsair Air 740 CPU Cooler: H110i GTX Storage: 2x 250GB SSD, 960GB SSD PSU: Corsair 1200W GPU: EVGA 1080 Ti FTW3 RAM: 16GB DDR4 

Other stuff: red sleeved cables, white LED lighting, 2 Noctua fans on the CPU cooler, and Be Quiet! PWM fans on the case.

On 11/12/2017 at 6:19 AM, BuckGup said:

Yeah, usually people just bought tons of GPUs for the CUDA cores, but tensor cores are like 100x the speed and you get over 600 of them

Only in very, very focused use cases. Basically put, you can't address each core with a separate set of instructions to run; you can only address all of them (or a large group of them) and tell them to perform the same operation on the data. This is great for big matrix operations that you repeat over lots and lots of input data, but it does not work if you need to adjust that operation each time around (as re-setting the op is slow).

So even in the neural network space, tensor cores are good for:

* image classification: you run 1,000,000 images through the same network and matrix filters, compute the errors, make adjustments, and repeat
* cluster analysis: as above, lots and lots of runs with different input data
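This "same op over a huge batch of data" pattern can be sketched in plain Python: one fixed weight matrix applied identically to every input vector, which is the shape of work tensor-core-style hardware accelerates. All sizes and values here are illustrative toys, not a real workload:

```python
import random

# One fixed weight matrix applied to a big batch of inputs: the
# operation never changes, only the data does. Sizes are illustrative.
random.seed(0)
DIM = 8
weights = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(DIM)]
batch = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(1000)]

def matvec(mat, vec):
    # Plain matrix-vector product: the ONE op repeated for every input.
    return [sum(m * v for m, v in zip(row, vec)) for row in mat]

activations = [matvec(weights, x) for x in batch]
print(len(activations), len(activations[0]))  # 1000 8
```

Because the operation is fixed, the hardware can be set up once and then stream data through it, which is exactly why re-setting the op each iteration kills the advantage.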

What it does not work with at all is when you need to adjust the network on each new bit of data.

The key example of this is any sequence data (e.g. forecasting), where the neural network keeps a memory within itself, so each time you put in new data it behaves a little differently based on the last few runs. This is known as a recurrent network. Commonly used for:

* financial forecasting
* voice/audio-to-text/meaning processors (this could also be a system that detects the birdsong of an endangered bird so it can be tracked over large areas, for example)
* flash flood forecasting and other ultra-local weather forecasting: these take as input the current weather plus the forecast for related regions, and use a recurrent neural network to predict the response pattern of local rivers to detect whether there will be a sudden flash flood.
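The "memory within itself" idea above can be sketched as a minimal recurrent step: the hidden state carries over between inputs, so the response to each new input depends on everything seen before it. Every size and weight here is an illustrative toy, not any real model:

```python
import math
import random

# Minimal recurrent step: the hidden state h carries "memory", so
# the effective operation changes a little on every step -- the
# opposite of the fixed batched op that tensor cores like.
random.seed(0)
DIM = 4
W_in = [[random.uniform(-0.5, 0.5) for _ in range(DIM)] for _ in range(DIM)]
W_h = [[random.uniform(-0.5, 0.5) for _ in range(DIM)] for _ in range(DIM)]

def rnn_step(h, x):
    # New state mixes the previous state with the new input.
    return [math.tanh(sum(W_in[i][j] * x[j] + W_h[i][j] * h[j]
                          for j in range(DIM)))
            for i in range(DIM)]

h = [0.0] * DIM
sequence = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(3)]
for x in sequence:
    h = rnn_step(h, x)  # same x later would give a different h
print(len(h))  # 4
```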

 

1 hour ago, hishnash said:

Only in very, very focused use cases. [...] What it does not work with at all is when you need to adjust the network on each new bit of data.

 

Correct, but the word through the grapevine is that Nvidia's next-gen tensor processor will also have the neuromorphic capability you describe.

13 hours ago, Bit_Guardian said:

Correct, but the word through the grapevine is that Nvidia's next-gen tensor processor will also have the neuromorphic capability you describe.

Given that others (Intel, IBM, etc.) already have chipsets with these features in place, yes, I would expect as much.

