
GTX 1050 Ti ODDITY Edition

Bobobert

Hello there. I'm opening this thread so we can investigate this card. It's a Gigabyte GTX 1050 Ti that I use in my esports rig, paired with a Ryzen 2600. The funny thing is the clocks it reaches by default: I haven't touched the BIOS or any setting since I bought it, yet every sensor utility shows the card reaching 1720 MHz under 3D load.
Given the temperatures it hits while doing this in 3DMark (65°C), one could say I got lucky in the GPU lottery. But why does the clock go up to 1720 MHz without any configuration? Or is this some trick the vendor pulled in the firmware?
I thought it would be interesting to discuss, and maybe run more tests to see whether it's legitimately running at the reported speed.

[Attached screenshots: 1050_1.gif, 1050_2.gif, oddity.png]


It's called GPU Boost. If there's power and thermal headroom, the card boosts automatically. AMD does it too, and it's pretty great since it gives you free performance even if you don't know how, or don't care, to overclock.
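For anyone curious what "boost until you hit a limit" looks like, here's a toy model in Python. To be clear, this is purely illustrative and not Nvidia's actual algorithm: the ~13 MHz step is Pascal's real boost bin size, but the base clock, limits, and per-step power/heat costs are all made-up numbers.

```python
# Toy model of GPU Boost: step the core clock up in ~13 MHz bins (Pascal's
# real bin size) until the card would exceed its power or thermal limit.
# All limits and per-step costs here are made up for illustration.
def boosted_clock(base_mhz, max_boost_mhz, temp_c, power_w,
                  temp_limit_c=83, power_limit_w=75, step_mhz=13):
    clock = base_mhz
    while (clock + step_mhz <= max_boost_mhz
           and temp_c < temp_limit_c
           and power_w < power_limit_w):
        clock += step_mhz
        power_w += 0.5   # pretend each bin costs 0.5 W (made up)
        temp_c += 0.3    # ...and 0.3 degrees C (made up)
    return clock

# A cool card with power headroom climbs well past its base clock:
print(boosted_clock(1290, 1720, temp_c=65, power_w=60))  # → 1680
```

In this sketch the card stops at the power limit before reaching its maximum boost bin, which is exactly the kind of behavior you see on real cards: whichever limit bites first caps the clock.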

I WILL find your ITX build thread, and I WILL recommend the SilverStone Sugo SG13B

 

Primary PC:

i7 8086k (won) - EVGA Z370 Classified K - G.Skill Trident Z RGB - WD SN750 - Jedi Order Titan Xp - Hyper 212 Black (with RGB Riing flair) - EVGA G3 650W - dual booting Windows 10 and Linux - Black and green theme, Razer brainwashed me.

Draws 400 watts under max load, for reference.

 

Linux Proliant ML150 G6:

Dual Xeon X5560 - 24GB ECC DDR3 - GTX 750 Ti - old Seagate 1.5TB HDD - dark mode Ubuntu (and Win7, cuz why not)

 

How many watts do I need? Seasonic Focus thread, PSU misconceptions, protections explained, group reg is bad


4 minutes ago, Bobobert said:

Hello there. I'm opening this thread so we can investigate this card. It's a Gigabyte GTX 1050 Ti that I use in my esports rig, paired with a Ryzen 2600. The funny thing is the clocks it reaches by default: I haven't touched the BIOS or any setting since I bought it, yet every sensor utility shows the card reaching 1720 MHz under 3D load.
Given the temperatures it hits while doing this in 3DMark (65°C), one could say I got lucky in the GPU lottery. But why does the clock go up to 1720 MHz without any configuration? Or is this some trick the vendor pulled in the firmware?
I thought it would be interesting to discuss, and maybe run more tests to see whether it's legitimately running at the reported speed.

If the power management mode in the Nvidia Control Panel is set to Optimal Power, then on the Windows desktop the card underclocks itself to save power. As soon as it detects a D3D workload or a game, it clocks up accordingly. As far as clock speeds go, I'm not sure if that Gigabyte card of yours is overclocked out of the box. Actually, 1700 MHz is on the low side for the core; you should be able to reach 2000 MHz if I'm not mistaken. But overclocking the VRAM is where the real performance gains will come from.
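The idle-versus-load behavior described here is easy to spot if you log your core clock over time. A quick sketch (the 500 MHz cutoff and the sample readings are made up for illustration; Pascal cards typically idle somewhere around 139-300 MHz and boost well above 1000 MHz under load):

```python
# Classify sampled core clocks (MHz) as desktop-idle or 3D-load states.
# The 500 MHz cutoff is an arbitrary threshold chosen for illustration.
def classify_clocks(samples_mhz, idle_cutoff_mhz=500):
    return ["idle" if mhz < idle_cutoff_mhz else "load" for mhz in samples_mhz]

print(classify_clocks([139, 139, 1721, 1708, 139]))
# → ['idle', 'idle', 'load', 'load', 'idle']
```

If your log looks like this, the 1720 MHz readings are just the card boosting when a 3D workload appears, then dropping back to idle clocks on the desktop.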

Asus Sabertooth x79 / 4930k @ 4500 @ 1.408v / Gigabyte WF 2080 RTX / Corsair VG 64GB @ 1866 & AX1600i & H115i Pro @ 2x Noctua NF-A14 / Carbide 330r Blackout

Scarlett 2i2 Audio Interface / KRK Rokits 10" / Sennheiser HD 650 / Logitech G Pro Wireless Mouse & G915 Linear & G935 & C920 / SL 88 Grand / Cakewalk / NF-A14 Int P12 Ex
AOC 40" 4k Curved / LG 55" OLED C9 120hz / LaCie Porsche Design 2TB & 500GB / Samsung 950 Pro 500GB / 850 Pro 500GB / Crucial m4 500GB / Asus M.2 Card


Nvidia GPU Boost. Most cards run 200-300 MHz above their factory base clocks when it kicks in, but I've seen some that happily do an extra 400 MHz.

 

It is quite temperature sensitive, though: you lose frequency as the card gets hot.
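You can watch this temperature sensitivity yourself by logging `nvidia-smi --query-gpu=clocks.gr,temperature.gpu --format=csv,noheader,nounits -l 1` to a file while running 3DMark (those are real nvidia-smi options). A small sketch that parses such a log and reports how the clock moved with temperature; the sample readings below are made up for illustration:

```python
# Parse lines of "core_clock_mhz, temperature_c" (the nounits CSV shape that
# nvidia-smi emits for clocks.gr,temperature.gpu) and compare the clock at
# the coolest and hottest sample.
def clock_vs_temp(log_text):
    samples = []
    for line in log_text.strip().splitlines():
        clock_str, temp_str = line.split(",")
        samples.append((int(clock_str), int(temp_str)))
    coolest = min(samples, key=lambda s: s[1])
    hottest = max(samples, key=lambda s: s[1])
    return {
        "clock_when_coolest": coolest[0],
        "clock_when_hottest": hottest[0],
        "clock_drop_mhz": coolest[0] - hottest[0],
    }

sample_log = """\
1721, 52
1708, 58
1695, 63
1683, 65
"""
print(clock_vs_temp(sample_log))
# shows the core dropping 38 MHz as the card warms from 52°C to 65°C
```

A drop of a bin or three as the card warms up is normal GPU Boost behavior, not a sign of anything wrong.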

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.47V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: 1TB HP EX920 PCIe x4 M.2 SSD + 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172), 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single thread: 168 Multi-thread: 833

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s

