
Nvidia 30 Series unveiled - RTX 3080 2x faster than 2080 for $699

illegalwater

Based on the wattage they showed at the show, what would be the recommended PSU wattage for each card?


13 hours ago, GoodBytes said:

HDMI is not free. It has royalty costs associated with it. This is why you don't have 20 of them on a TV.

Now I'm trying to think of a use case where you'd need that many on a TV...

 

13 hours ago, GoodBytes said:

DisplayPort is fully backward compatible down to HDMI and single-link DVI. HDMI can't be upgraded to DP.

The 3 video modes just mentioned are part of the DisplayPort standard, which avoids any adapter mess. You can easily get a cable that is DP on one end and HDMI on the other, without any circuitry or conversion, ensuring zero latency penalty (the same goes for single-link DVI). However, HDMI cannot be converted to DisplayPort; the HDMI standard doesn't recognize DP.

To my understanding, it is not mandatory for a DP connector to support HDMI output. Wouldn't it also incur the HDMI licence fee if it did? That's why we have a mix of passive and active adapters available, depending on the capabilities of the source. I don't know whether you can adapt from an Nvidia or AMD GPU to HDMI passively. I know you can on not-too-old Dell desktop PCs using Intel graphics, where the ports are marked DP++ to indicate so.

 

13 hours ago, GoodBytes said:

HDMI also has under- and over-scan issues, a problem that DP doesn't have. Yes, I am tired of people complaining that they can't see their taskbar, or that the image is not full screen despite the native resolution being set (or the screen resolution not being detected... another problem). DP, like DVI, just works.

I think that is a display choice and not due to the connection method. I've not encountered that with HDMI-connected monitors, but it is very much a thing with TVs, where content is normally overscanned by default until you turn it off, ideally at the display rather than using scaling tricks at the source to make it fit.
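
To put a rough number on it, here's a quick sketch (the ~5% figure is just the commonly cited ballpark; actual overscan varies by TV model and picture mode):

```python
# Pixels left visible after TV overscan. The 5% default is the commonly
# cited ballpark; actual overscan varies by TV model and picture mode.
def visible_area(width, height, overscan_pct=5.0):
    scale = 1 - overscan_pct / 100
    return round(width * scale), round(height * scale)

# A 1080p desktop loses the outer edges - which is where the taskbar lives.
print(visible_area(1920, 1080))  # (1824, 1026)
```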

 

I do have one gripe with HDMI: the limited dynamic range option, which seems to be the default more often than not, resulting in washed-out images until the setting is changed to full range.
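
For reference, the washed-out look is a levels mismatch: 8-bit limited/video range maps black to 16 and white to 235 instead of 0 and 255. A minimal sketch of the standard expansion:

```python
# Expand an 8-bit limited/video-range value (16-235) to full/PC range (0-255).
# Standard mapping; out-of-range values are clipped first.
def limited_to_full(v):
    v = min(max(v, 16), 235)
    return round((v - 16) * 255 / 219)

print(limited_to_full(16))   # 0   (black stays black)
print(limited_to_full(235))  # 255 (white stays white)
# If the source sends limited range but the display treats it as full,
# black is shown at level 16 - the washed-out grey described above.
```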

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


6 minutes ago, u4ea said:

Based on the wattage they showed at the show, what would be the recommended PSU wattage for each card?

The 3070 is 220 W, the 3080 is 320 W, and the 3090 is 350 W.

It depends on your CPU, but Nvidia's recommended PSU wattage, based on a 10900K, is 650 W for the 3070 and 750 W for the 3080 and 3090.
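
As a rough sanity check on those recommendations, here's a back-of-the-envelope sketch (the CPU and rest-of-system draws and the ~70% load target are illustrative assumptions, not Nvidia's actual method):

```python
import math

# Back-of-the-envelope PSU sizing. The GPU TDPs are Nvidia's announced
# figures; the CPU draw, "rest of system" draw, and ~70% load target are
# illustrative assumptions, not official guidance.
GPU_TDP = {"3070": 220, "3080": 320, "3090": 350}

def recommended_psu(gpu, cpu_watts=125, rest_watts=75, load_target=0.7):
    total = GPU_TDP[gpu] + cpu_watts + rest_watts    # estimated peak system draw
    return math.ceil(total / load_target / 50) * 50  # round up to a typical PSU size

for gpu in GPU_TDP:
    print(f"{gpu}: ~{recommended_psu(gpu)} W")  # 600 / 750 / 800 W
```

With those assumptions it lands in the same ballpark as Nvidia's 650/750 W recommendations.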


1 minute ago, spartaman64 said:

The 3070 is 220 W, the 3080 is 320 W, and the 3090 is 350 W.

It depends on your CPU, but Nvidia's recommended PSU wattage, based on a 10900K, is 650 W for the 3070 and 750 W for the 3080 and 3090.

Damn, was going to buy the NZXT H1, but that only has a 650w PSU built-in. And I don't want to swap that one out. 😛


1 minute ago, porina said:

Now I'm trying to think of a use case where you'd need that many on a TV...

 

Blu-ray player
Xbox 360
Xbox One
PS3
PS4
Wii U
Switch
PC

That's 8.

Let's see, with adapters:

NES
SNES
N64
GameCube
PS1
PS2
OG Xbox
Genesis
Saturn
A2600
Dreamcast
A5200

Just now, u4ea said:

Damn, was going to buy the NZXT H1, but that only has a 650w PSU built-in. And I don't want to swap that one out. 😛

What CPU do you have? With that much performance you can undervolt a bit so the card pulls less wattage, then get a new PSU eventually.

I could use some help with this!

Please PM me if you would like to contribute to my GPU BIOS database (includes overclocking BIOSes, stock BIOSes, and upgrades to GPUs via modding).

Bios database



38 minutes ago, Drama Lama said:

Let’s wait 

Don't wanna wait anymore. 😭

I'm running a 1080 and I desperately need an upgrade. So it's 10GB or 20GB 3080 now and I'm not sure if waiting for the 20GB version will be worth it. 



I'm not sure if someone posted it already, but Digital Foundry uploaded a video claiming that in most games the 3080 delivers a 75-95% framerate increase over the 2080.

Specs: Motherboard: Asus X470-PLUS TUF Gaming (yes, I know it's poor, but I wasn't informed) RAM: Corsair Vengeance LPX DDR4 3200MHz CL16-18-18-36 2x8GB
CPU: Ryzen 9 5900X Case: Antec P8 PSU: Corsair RM850x Cooler: Antec K240 with two Noctua Industrial PPC 3000 PWM
Drives: Samsung 970 EVO Plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 Ti Black Edition


Discussed just one page back and mostly dismissed as "advertisement", since it's the games and settings Nvidia allowed to be shown, with no detailed data.

F@H
Desktop: i9-13900K, ASUS Z790-E, 64GB DDR5-6000 CL36, RTX3080, 2TB MP600 Pro XT, 2TB SX8200Pro, 2x16TB Ironwolf RAID0, Corsair HX1200, Antec Vortex 360 AIO, Thermaltake Versa H25 TG, Samsung 4K curved 49" TV, 23" secondary, Mountain Everest Max

Mobile SFF rig: i9-9900K, Noctua NH-L9i, Asrock Z390 Phantom ITX-AC, 32GB, GTX1070, 2x1TB SX8200Pro RAID0, 2x5TB 2.5" HDD RAID0, Athena 500W Flex (Noctua fan), Custom 4.7l 3D printed case

 

Asus Zenbook UM325UA, Ryzen 7 5700u, 16GB, 1TB, OLED

 

GPD Win 2


I think 10GB is enough for 1440p (4K and 8K need much more VRAM).
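
As a rough sense of scale, here's the raw framebuffer math (illustrative only; textures and assets dominate real VRAM usage, so treat this as a lower bound, not a usage estimate):

```python
# Raw framebuffer size per frame at 4 bytes/pixel (RGBA8).
# Illustrative only - real VRAM usage is dominated by textures, geometry
# and driver allocations, not the framebuffer itself.
RESOLUTIONS = {"1440p": (2560, 1440), "4K": (3840, 2160), "8K": (7680, 4320)}

for name, (w, h) in RESOLUTIONS.items():
    mib = w * h * 4 / 2**20
    print(f"{name}: {mib:.0f} MiB per frame")  # ~14 / 32 / 127 MiB
```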

 

I will buy RTX 3080 to replace my RTX 2080 Super.

PC #1 : Gigabyte Z170XP-SLI | i7-7700 | Cryorig C7 Cu | 32GB DDR4-2400 | LSI SAS 9211-8i | 240GB NVMe M.2 PCIe PNY CS2030 | SSD&HDDs 59.5TB total | Quantum LTO5 HH SAS drive | GC-Alpine Ridge | Corsair HX750i | Cooler Master Stacker STC-T01 | ASUS TUF Gaming VG27AQ 2560x1440 @ 60 Hz (plugged HDMI port, shared with PC #2) | Win10
PC #2 : Gigabyte MW70-3S0 | 2x E5-2689 v4 | 2x Intel BXSTS200C | 32GB DDR4-2400 ECC Reg | MSI RTX 3080 Ti Suprim X | 2x 1TB SSD SATA Samsung 870 EVO | Corsair AX1600i | Lian Li PC-A77 | ASUS TUF Gaming VG27AQ 2560x1440 @ 144 Hz (plugged DP port, shared with PC #1) | Win10
PC #3 : Mini PC Zotac 4K | Celeron N3150 | 8GB DDR3L 1600 | 250GB M.2 SATA WD Blue | Sound Blaster X-Fi Surround 5.1 Pro USB | Samsung Blu-ray writer USB | Genius SP-HF1800A | TV Panasonic TX-40DX600E UltraHD | Win10
PC #4 : ASUS P2B-F | PIII 500MHz | 512MB SDR 100 | Leadtek WinFast GeForce 256 SDR 32MB | 2x Guillemot Maxi Gamer 3D² 8MB in SLI | Creative Sound Blaster AWE64 ISA | 80GB HDD UATA | Fortron/Source FSP235-60GI | Zalman R1 | DELL E151FP 15" TFT 1024x768 | Win98SE

Laptop : Lenovo ThinkPad T460p | i7-6700HQ | 16GB DDR4 2133 | GeForce 940MX | 240GB SSD PNY CS900 | 14" IPS 1920x1080 | Win11

PC tablet : Fujitsu Point 1600 | PMMX 166MHz | 160MB EDO | 20GB HDD UATA | external floppy drive | 10.4" DSTN 800x600 touchscreen | AGFA SnapScan 1212u blue | Win98SE

Laptop collection #1 : IBM ThinkPad 340CSE | 486SLC2 66MHz | 12MB RAM | 360MB IDE | internal floppy drive | 10.4" DSTN 640x480 256 color | Win3.1 with MS-DOS 6.22

Laptop collection #2 : IBM ThinkPad 380E | PMMX 150MHz | 80MB EDO | NeoMagic MagicGraph128XD | 2.1GB IDE | internal floppy drive | internal CD-ROM drive | Intel PRO/100 Mobile PCMCIA | 12.1" FRSTN 800x600 16-bit color | Win98

Laptop collection #3 : Toshiba T2130CS | 486DX4 75MHz | 32MB EDO | 520MB IDE | internal floppy drive | 10.4" STN 640x480 256 color | Win3.1 with MS-DOS 6.22

And 6 others computers (Intel Compute Stick x5-Z8330, Giada Slim N10 WinXP, 2 Apple classic and 2 PC pocket WinCE)


26 minutes ago, Senzelian said:

Don't wanna wait anymore. 😭

I'm running a 1080 and I desperately need an upgrade. So it's 10GB or 20GB 3080 now and I'm not sure if waiting for the 20GB version will be worth it. 

Meanwhile, me with a 1050: still good for my usage.


26 minutes ago, Kilrah said:

Discussed just one page back and mostly dismissed as "advertisement" since it's the games and settings NVidia allowed to show and no detailed data.

NVIDIA's graphs in the presentation were advertisement. Digital Foundry's results are just that, results. They weren't running them at 720p or some other nonsense, and Metro, Control and Tomb Raider seem like a fairly reasonable set. Sure, I wouldn't mind GTA5 or something, but does it really matter at this point? When you have such good results in a few games, chances are you'll have them in all of them.


2 minutes ago, RejZoR said:

NVIDIA's graphs in the presentation were advertisement. Digital Foundry's results are just that, results. They weren't running them at 720p or some other nonsense, and Metro, Control and Tomb Raider seem like a fairly reasonable set. Sure, I wouldn't mind GTA5 or something, but does it really matter at this point? When you have such good results in a few games, chances are you'll have them in all of them.

They're controlled results, with no FPS numbers, 1%/0.1% lows, impact of features like RTX, etc.

The presentation gave numbers; this video shows how they got those numbers, but that's about it.

F@H
Desktop: i9-13900K, ASUS Z790-E, 64GB DDR5-6000 CL36, RTX3080, 2TB MP600 Pro XT, 2TB SX8200Pro, 2x16TB Ironwolf RAID0, Corsair HX1200, Antec Vortex 360 AIO, Thermaltake Versa H25 TG, Samsung 4K curved 49" TV, 23" secondary, Mountain Everest Max

Mobile SFF rig: i9-9900K, Noctua NH-L9i, Asrock Z390 Phantom ITX-AC, 32GB, GTX1070, 2x1TB SX8200Pro RAID0, 2x5TB 2.5" HDD RAID0, Athena 500W Flex (Noctua fan), Custom 4.7l 3D printed case

 

Asus Zenbook UM325UA, Ryzen 7 5700u, 16GB, 1TB, OLED

 

GPD Win 2


2 hours ago, nick name said:

Well, Lenovo posted some products using a 3070 Ti with 16GB, so if those are accurate product specs, then maybe a 3080 Ti?

Lenovo also has a 1180 "confirmed".

Not sure there is a company I would trust less with this kinda stuff.

They simply add names they got from leaks themselves as placeholders for their future products.


3 hours ago, RejZoR said:

I kinda like NVIDIA ditching the Ti and Super nonsense (hopefully) and streamlining the lineup. If we later get an RTX 3060, RTX 3050 and RTX 3030, that should be enough (I actually think they'll go with RTX across the board, and the RTX 3030 might actually perform on the level of an RTX 2060, which means actually usable RTX despite being the lowest-end card). Given the price point, I think they're just gonna stick with it and keep selling these without any price drops unless AMD has anything to say about it. But I don't think they'll release any refreshes until next generation. They made a massive performance leap and they can hold on to these for longer. And the same applies to consumers.

welp.. that comment was short lived lol

[attached screenshot]

PC:
MSI B450 Gaming Pro Carbon AC (motherboard) | ASRock Radeon RX 6950 XT Phantom Gaming D 16G (GPU)
Ryzen 7 5800X3D (CPU) | LG 32GK650F 2560x1440 144Hz (monitor)
Arctic Liquid Freezer II 240 A-RGB (CPU cooler) | Seasonic Focus Plus Gold 850W (PSU)
Cooler Master MasterBox MB511 RGB (case) | Kingston Fury Beast 32GB (2x16GB) DDR4 @ 3600MHz (memory)
Corsair K95 RGB Platinum (keyboard) | Razer Viper Ultimate (mouse)


I'm quietly excited about the 3070/80, but really intrigued by the 3090. It seems like it is truly going to be a monster. But I'll wait for Linus, the two Steves and Jay to get some time with it.

Steve from GN said a few times he is interested in the thermals, noise and design. That is indeed interesting, but I really want to see how different CPUs cope with the 3080, and more specifically the 3090, and whether PCIe 3.0 vs 4.0 means a CPU change may be in order for specific workloads, for some people or companies.

CORSAIR RIPPER: AMD 3970X - 3080TI & 2080TI - 64GB Ram - 2.5TB NVME SSD's - 35" G-Sync 120hz 1440P
MFB (Mining/Folding/Boinc): AMD 1600 - 3080 & 1080Ti - 16GB Ram - 240GB SSD
Dell OPTIPLEX:  Intel i5 6500 - 8GB Ram - 256GB SSD

PC & CONSOLE GAMER

1 minute ago, Kilrah said:

They're controlled results, with no FPS numbers, 1%/0.1% lows, etc.

There will be no major surprises. Why would there be?

The test done is actually all you need. You want to know the uplift? There you go.

How much could go wrong after that? Is it possible the new GPUs have terrible 0.1% lows? Well, maybe... but it's certainly not likely.

And if all those games run that well, most of which are the standard go-to games for most reviews anyway, what are the chances that all other games suddenly run way worse?


16 minutes ago, Tech Enthusiast said:

Lenovo also has a 1180 "confirmed".

Not sure there is a company I would trust less with this kinda stuff.

They simply add names they got from leaks themselves as placeholders for their future products.

Noted.  

 

I also hate this faulty Lenovo laptop that I'd feel too guilty selling or giving away, so I have to keep using it. So Lenovo isn't a name I trust much anymore.

AMD Ryzen 5800X | Fractal Design S36 360 AIO w/ 6 Corsair SP120L fans | Asus Crosshair VII WiFi X470 | G.SKILL TridentZ 4400CL19 2x8GB @ 3800MHz 14-14-14-14-30 | EVGA 3080 FTW3 Hybrid | Samsung 970 EVO M.2 NVMe 500GB - Boot Drive | Samsung 850 EVO SSD 1TB - Game Drive | Seagate 1TB HDD - Media Drive | EVGA 650 G3 PSU | Thermaltake Core P3 Case


The only leg AMD has left to stand on is its process and power advantage. If they can bang out a 3070 competitor at better wattage and undercut it, they'll be good, but I have my doubts. This is very bad for competition.


25 minutes ago, Kilrah said:

They're controlled results, with no FPS numbers, 1%/0.1% lows, impact of features like RTX, etc.

The presentation gave numbers; this video shows how they got those numbers, but that's about it.

You can derive the framerate from the percentages by knowing how much an RTX 2080 gets in those games. Also, do people really care so much about low percentiles? Unless you have a crappy, bottlenecked system full of all sorts of other issues, it's irrelevant imo, especially now that both AMD and NVIDIA care a lot about frame time pacing (Ultra Low Latency, Reflex...).
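
For example, a quick sketch of that derivation (the 2080 baseline numbers are made-up placeholders; substitute real benchmark results for the games and settings you care about):

```python
# Turning Digital Foundry's relative uplift into absolute framerates.
# The 2080 baselines below are made-up placeholders - plug in real
# 2080 results for the games/settings you care about.
baseline_2080_fps = {"Control": 60, "Shadow of the Tomb Raider": 75}
uplift_vs_2080 = {"Control": 0.85, "Shadow of the Tomb Raider": 0.80}  # +85%, +80%

for game, fps in baseline_2080_fps.items():
    estimated = fps * (1 + uplift_vs_2080[game])
    print(f"{game}: ~{estimated:.0f} fps on a 3080")
```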


5 hours ago, NE0XY said:

Thanks, 

You mean the HydroCopper, right? I had a 980 with an HC block a couple of years ago.

Actually, I didn't look properly: the Kingpin is also water cooled, but it already comes with an AIO, and it's the same for the Hybrid, so yes, it's the HydroCopper for a custom loop.


26 minutes ago, MarcoZ said:

The only leg AMD has left to stand on is its process and power advantage. If they can bang out a 3070 competitor at better wattage and undercut it, they'll be good, but I have my doubts. This is very bad for competition.

I think they have a 3080 competitor, and the fact that Nvidia is pricing things this well is an indicator that they might have something really competitive.


Link to comment
Share on other sites

Link to post
Share on other sites

Just now, spartaman64 said:

I think they have a 3080 competitor, and the fact that Nvidia is pricing things this well is an indicator that they might have something really competitive.


Don't know about the 3080... The horsepower will definitely be there, but we'll need to see the drivers. That would be too good to be true.

It's likely a 3070 competitor, probably $100 cheaper.

Being more efficient, though, AMD might be able to squeeze more performance out of it.

The waiting is killing me.

MOTHERBOARD: ASRock H97 Pro4 CPU: Intel Core i5-4460 @3.30 Ghz Intel Xeon E3-1271v3 @4.00 Ghz RAM: 32Gb (4x8Gb) Kingstone HyperX Fury DDR3@1600 Mhz (9-9-9-27)

GPU: MSI 390 8Gb Gaming Edition PSU: XFX TS 650w Bronze Enermax Revolution D.F. 650w 80+ Gold MOUSE: Logitech G502 Proteus Spectrum KEYBOARD: Monokey Standard Suave Blue

STORAGE: SSD Samsung EVO 850 250Gb // HDD WD Green 1Tb // HDD WD Blue 4Tb // HDD WD Blue 160Gb CASE: Fractal Design Define R5 Windowed OS: Windows 11 Pro x64 Bit

MONITORS: Samsung CFG7 C24FG7xFQ @144hz // Samsung SyncMaster TA350 LT23A350 @60hz Samsung Odyssey G7 COOLER: Noctua NH-D15

 


 

1 hour ago, porina said:

To my understanding, it is not mandatory for a DP connector to support HDMI output. Wouldn't it also incur the HDMI licence fee if it did? That's why we have a mix of passive and active adapters available, depending on the capabilities of the source.

HDMI fees are on the HDMI port and HDMI cables, from my understanding. Input and output.

 

Quote

I don't know whether you can adapt from an Nvidia or AMD GPU to HDMI passively. I know you can on not-too-old Dell desktop PCs using Intel graphics, where the ports are marked DP++ to indicate so.

I do not have a GPU with DP to test, but this has not been a problem for ages. DP++ is no longer mentioned on products these days, as it is so widely implemented that it would be unexpected for a port not to support it. Graphics cards with a limited number of DP ports, or an inability to output HDMI via DisplayPort, have more to do with cost cutting than anything (it does add complexity, but those challenges should be completely solved these days via integrated solutions, hence why the DP++ marking is no longer in use).

 

My old Dell Latitude E6400 from 2008 (Core 2 Duo), which was one of the early laptops with this display connector, had no problem outputting to HDMI. So I am surprised that your Dell desktop doesn't support it. Then again, my laptop was powered by an Nvidia GPU, but I recall the Intel model (the Intel GPU was in the chipset at the time) supported it as well. So, I guess Dell wanted to save a penny on your desktop? Or you had a faulty cable?

 

1 hour ago, porina said:

I think that is a display choice and not due to the connection method. I've not encountered that with HDMI-connected monitors, but it is very much a thing with TVs, where content is normally overscanned by default until you turn it off, ideally at the display rather than using scaling tricks at the source to make it fit.

What would be the purpose of overscanning an image?

 

1 hour ago, porina said:

I do have one gripe with HDMI: the limited dynamic range option, which seems to be the default more often than not, resulting in washed-out images until the setting is changed to full range.

That is because HDMI was designed for TVs, and low-cost TVs with shitty panels were kept in mind. HDMI didn't care about PCs.

DisplayPort is the reverse: its focus is on PCs, hence why it is not too interested in adding ARC support.


1 hour ago, Senzelian said:

Don't wanna wait anymore. 😭

I'm running a 1080 and I desperately need an upgrade. So it's 10GB or 20GB 3080 now and I'm not sure if waiting for the 20GB version will be worth it. 

1080?! I have a 680! You can live with what you have a while longer.


4 minutes ago, Parideboy said:

Don't know about the 3080... The horsepower will definitely be there, but we'll need to see the drivers. That would be too good to be true.

It's likely a 3070 competitor, probably $100 cheaper.

Being more efficient, though, AMD might be able to squeeze more performance out of it.

The waiting is killing me.

There were rumors that Big Navi is 40-50% faster than the 2080 Ti.

