AMD Announces 'FreeSync 2'

HKZeroFive
8 minutes ago, MageTank said:

Sounds kinky...

It is summer in OZ.  Surprising the things one has to endure to look fabulous in a bikini.


Just now, Wix said:

Why is everyone finding the name ridiculous?

 

It's just sorta like HDMI 1.3, 2.0 or Bluetooth 4.1 etc.

Because AMD, so it's perfectly valid...


1 minute ago, Wix said:

Why is everyone finding the name ridiculous?

 

It's just sorta like HDMI 1.3, 2.0 or Bluetooth 4.1 etc.

 

Only resident AMD haters find the name ridiculous. I guess they have nothing better to criticize AMD for, now that they are actually about to release some pretty good hardware.


4 minutes ago, Notional said:

Only resident AMD haters find the name ridiculous. I guess they have nothing better to criticize AMD for, now that they are actually about to release some pretty good hardware.

No doubt the one thing that nVidia and Intel will be better at will suddenly become the most important thing in the world, even though six months ago nobody gave a shit about it (like how power consumption and heat output suddenly became important after nearly three years of Thermi).


3 minutes ago, Fetzie said:

No doubt the one thing that nVidia and Intel will be better at will suddenly become the most important thing in the world, even though six months ago nobody gave a shit about it (like how power consumption and heat output suddenly became important after nearly three years of Thermi).

 

Yup, who cares? It's not even thermal limits that are the issue in CPUs/GPUs atm anyways. Why not just support whichever company is currently doing the best for the consumers?


12 minutes ago, Notional said:

Yup, who cares? It's not even thermal limits that are the issue in CPUs/GPUs atm anyways. Why not just support whichever company is currently doing the best for the consumers?

Because at 300W for a single-GPU card, thermals and power were the limiting factors in a build. You suddenly needed a 1000W PSU just for 2-card XFire. That's ridiculous.
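Rough math behind that PSU claim, as a back-of-the-envelope sketch; the TDP figures, the ~75W misc allowance, and the ~35% headroom are ballpark assumptions, not measurements, and the recommended_psu_watts helper is just illustrative:

```python
# Back-of-the-envelope PSU sizing for a 2-card CrossFire build.
# All figures are illustrative assumptions, not measured draws.

def recommended_psu_watts(gpu_tdp_w, gpu_count, cpu_tdp_w, other_w=75, headroom=0.35):
    """Sum component TDPs plus a misc allowance, then add headroom so the
    PSU isn't sitting near its limit under sustained load."""
    estimated_load = gpu_tdp_w * gpu_count + cpu_tdp_w + other_w
    return estimated_load * (1 + headroom)

# Two ~300W cards plus a ~90W CPU lands right around a 1000W-class unit.
print(round(recommended_psu_watts(gpu_tdp_w=300, gpu_count=2, cpu_tdp_w=90)))  # ~1033
```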


2 minutes ago, patrickjp93 said:

Because at 300W for a single-GPU card, thermals and power were the limiting factors in a build. You suddenly needed a 1000W PSU just for 2-card XFire. That's ridiculous.

 

Xfire or SLI is ridiculous at this point in general. Afaik, no 14/16nm GPU runs at 300W TDP. Only NVidia's Titan X(P) cards are close, at about 250W TDP.


1 hour ago, SamStrecker said:

What is the deal with HDR? It looks fake and over-processed. It is cool if you take an HDR photo with your DSLR at dusk, but all these videos with it look fake.

It's partly because HDR needs to be experienced in person. I believe the 4K Ultra Blu-ray LTT video thread goes into greater detail.


48 minutes ago, Notional said:

Xfire or SLI is ridiculous at this point in general. Afaik, no 14/16nm GPU runs at 300W TDP. Only NVidia's Titan X(P) cards are close, at about 250W TDP.

The Vega flagship isn't known yet, but knowing AMD's Fury X and 290X, it will happen again.


4 minutes ago, patrickjp93 said:

The Vega flagship isn't known yet, but knowing AMD's Fury X and 290X, it will happen again.

As long as perf/TDP is equal to or better than that of NVidia's high-end cards, it really doesn't matter.


2 hours ago, Fetzie said:

That firmware update isn't necessarily user-serviceable though, so you'd have to send your monitor to Acer/LG/Samsung/whoever, which could be a multiple-week round trip :)

To add to that, the existing G-Sync chip may not have the performance to add said feature.

 

And as for the claim that latency makes the G-Sync chip a better design choice than the GPU, that is pure nonsense: not only is a GPU vastly more powerful, you can also integrate certain things into the render pipeline earlier. Plus, as per the above point, you don't have to replace your monitor to gain said features, and a GPU replacement is far more common.

 

Doing things at the source is always better than after the fact; if your bath is getting full, you turn the tap off rather than grab a bucket and start bailing :P.


People think "Freesync 2" is a bad name? Titan X being used again is an even worse name.


5 minutes ago, leadeater said:

To add to that, the existing G-Sync chip may not have the performance to add said feature.

 

And as for the claim that latency makes the G-Sync chip a better design choice than the GPU, that is pure nonsense: not only is a GPU vastly more powerful, you can also integrate certain things into the render pipeline earlier. Plus, as per the above point, you don't have to replace your monitor to gain said features, and a GPU replacement is far more common.

 

Doing things at the source is always better than after the fact; if your bath is getting full, you turn the tap off rather than grab a bucket and start bailing :P.

To add to this: It's not like it matters too much anyways. Current panels likely won't support HDR if they don't already, and if they need both HDR and G-Sync in the future, a new module will either be made to accommodate it, or the current one will be updated (if possible). If it costs more, so be it. People already pay a premium for G-Sync, and if they want HDR, they will chalk it up as an additional premium.

 

I personally wouldn't pay the premium, but I suppose it's subjective. I am still hoping for a world where Nvidia decides "Hey, let's allow the consumer to choose between using our G-Sync modules, or the standard built-in adaptive sync widely available in the spec itself!". Boy, what a world that would be. 


freesync 2 pro +


Freesync 2: Electric Boogaloo


44 minutes ago, Notional said:

As long as perf/TDP is equal to or better than that of NVidia's high-end cards, it really doesn't matter.

Too bad it isn't, and pretty much hasn't been since Fermi.


1 hour ago, patrickjp93 said:

Because at 300W for a single-GPU card, thermals and power were the limiting factors in a build. You suddenly needed a 1000W PSU just for 2-card XFire. That's ridiculous.

My dual 290X + 4930K has never used over 800W, not even during a FurMark + Prime95 run, and a hot-running GPU is fine so long as it doesn't thermal throttle and reduce performance. If there was a card released tomorrow that used 600W, cost $600-800 NZD and was more than 4 times as fast as my dual 290Xs, I would buy it.

 

And unless you're doing a small-case build, the thermal output of components doesn't mean a thing. Not once have I ever looked at power draw or thermals when shopping for a GPU; only performance and price, which I treat equally.
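A quick sanity check on that hypothetical card, using the post's own made-up numbers (4x the performance of dual 290Xs, and ~290W board power per 290X as a rough figure rather than a measurement; the perf_per_watt helper is purely illustrative), shows it would still be a big efficiency win despite the 600W draw:

```python
# Perf-per-watt comparison using the hypothetical numbers above.
# Board-power figures are rough assumptions, not measurements.

def perf_per_watt(relative_perf, board_power_w):
    return relative_perf / board_power_w

dual_290x = perf_per_watt(relative_perf=1.0, board_power_w=2 * 290)  # ~580W of GPU power
big_card = perf_per_watt(relative_perf=4.0, board_power_w=600)       # the hypothetical 600W card

print(f"dual 290X: {dual_290x:.4f} perf/W")
print(f"600W card: {big_card:.4f} perf/W")
print(f"ratio:     {big_card / dual_290x:.1f}x")  # ~3.9x better perf/W despite the higher draw
```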


So it's Freesync, but now they also have HDR. So why not FreesyncHDR?

Or even better, if they only care about marketing, why not FreesyncRGB?

You can sell anything right now, provided of course it has RGB.


2 minutes ago, leadeater said:

My dual 290X + 4930K has never used over 800W, not even during a FurMark + Prime95 run, and a hot-running GPU is fine so long as it doesn't thermal throttle and reduce performance. If there was a card released tomorrow that used 600W, cost $600-800 NZD and was more than 4 times as fast as my dual 290Xs, I would buy it.

 

And unless you're doing a small-case build, the thermal output of components doesn't mean a thing. Not once have I ever looked at power draw or thermals when shopping for a GPU; only performance and price, which I treat equally.

A buddy of mine had volt-modded GTX 780s in triple SLI on a 1300W SuperNOVA PSU with a delidded 4790K, and he only barely went beyond 1000W under absolute max system load. Now he has gone to a single 1070, and that PSU whines when it's only at a quarter of its rated load, lol. He also misses using it as a heater in the cold New York winters.

 

Also, thermal output can matter somewhat in larger builds, but not in the way most think. Having less thermally demanding components means you can somewhat get away with running quieter/slower fans. Quiet freaks will sometimes settle for mid-to-high-end hardware or limit their overclocking to achieve this. Aside from that, and the ITX example you gave, thermals/power consumption don't and shouldn't mean much to the vast majority of consumers.


2 minutes ago, Dan Castellaneta said:

Wouldn't a better name for this be FreeSync 1.1?

Well, look at GCN: they don't like using fractions, just whole numbers (GCN 1.1 and 1.2 do not in fact exist; it's GCN 1, 2, 3, 4).

