
AMD Acquired HiAlgo. Bye bye stuttering.

Prysin

Source: http://www.amd.com/en-us/press-releases/Pages/amd-acquires-software-2016jun29.aspx

 

Quote

AMD (NASDAQ: AMD) today announced the acquisition of software company HiAlgo Inc., a developer of unique PC gaming technologies designed to help Radeon™ RX Series GPUs transform gaming experience, increase GPU efficiency and improve the overall consistency of gaming experiences. The acquisition lays the groundwork for future gaming innovation in Radeon Software that will benefit owners of Radeon™ RX Series GPUs.

“Software is an integral part of advancing the science of graphics, enabling us to best harness the silicon of the GPU to maximize performance and deliver outstanding experiences in games and applications,” said Raja Koduri, senior vice president and chief architect, Radeon Technologies Group, AMD. “HiAlgo embodies our spirit of passion, persistence and play by delivering a number of creative approaches to software that improve gamers’ experiences, and helps future-proof the GPU.”

 

 

Now what is HiAlgo? It is a company best known for a Skyrim mod that smoothed out frametimes to get rid of stuttering. It did so by applying a wrapper over the DX9 API.
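To give a rough idea of what such a wrapper can do once it sits between the game and D3D9, here is a minimal sketch of per-frame pacing logic run at Present() time. This is purely illustrative and not HiAlgo's actual code (their main trick is dynamic resolution, which comes up further down the thread); the class and parameter names are made up:

```cpp
// Illustrative only: the kind of per-frame logic a D3D9 wrapper can run each
// time the game calls IDirect3DDevice9::Present(). Not HiAlgo's actual code.
#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;

class FramePacer {
public:
    explicit FramePacer(double target_fps)
        : target_(std::chrono::duration<double>(1.0 / target_fps)),
          last_(Clock::now()) {}

    // Called once per frame, right before the wrapped Present() is forwarded
    // to the real device. If the game finished the frame early, sleep off the
    // remainder so frames reach the display at an even cadence.
    void OnPresent() {
        const auto elapsed = Clock::now() - last_;
        if (elapsed < target_) {
            std::this_thread::sleep_for(target_ - elapsed);
        }
        last_ = Clock::now();
    }

private:
    std::chrono::duration<double> target_;  // desired time between presents
    Clock::time_point last_;                // when the previous frame was presented
};
```

Real frame pacing is obviously smarter than a sleep, but the point stands: a wrapper only gets to act at this one spot, after the game has already submitted the frame.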

 

Now, Raja Koduri, the VP of the Radeon Technologies Group, explains this pretty darn well here:

 

 

In case the link doesn't work: he starts explaining at 43:45.

 

 

 

Personally, I think this is DAMN nice. If AMD is able to improve frame latencies with this, then even if it doesn't quite match up in FPS, 100% consistent frame delivery, aka NO STUTTERING, is perhaps even more important than a few extra FPS (assuming you are over 60 FPS average).

One of the big selling points of Maxwell versus, say, the Fury and Nano was that the 980 was remarkably stable in its frame delivery, while most AMD cards, due to driver overhead and whatnot, always have a little bit of stutter.

 

Scott Wasson, another AMD employee and former Editor-in-Chief of The Tech Report (the review site that, alongside Nvidia, pioneered frame latency testing), posted this image on Twitter the other day:
https://mobile.twitter.com/scottwasson/status/748162402265403392/photo/1

 

If this graph is accurate, then DAMN, that is good. The 970 is showing REALLY bad microstuttering in that graph.
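For anyone who hasn't looked at frame latency numbers before: the metric The Tech Report pioneered looks at individual frame times instead of average FPS, usually the 99th percentile. A quick sketch with made-up frame times shows why two runs with practically the same average FPS can feel completely different:

```cpp
// Sketch of the frame-latency metrics used in reviews like The Tech Report's:
// average FPS hides microstutter, the 99th-percentile frame time exposes it.
#include <algorithm>
#include <cstdio>
#include <vector>

double AverageFps(const std::vector<double>& frame_times_ms) {
    double total_ms = 0.0;
    for (double t : frame_times_ms) total_ms += t;
    return 1000.0 * frame_times_ms.size() / total_ms;
}

double PercentileMs(std::vector<double> frame_times_ms, double pct) {
    std::sort(frame_times_ms.begin(), frame_times_ms.end());
    size_t idx = static_cast<size_t>(pct / 100.0 * (frame_times_ms.size() - 1));
    return frame_times_ms[idx];
}

int main() {
    // Two hypothetical runs with near-identical average FPS.
    std::vector<double> smooth(100, 16.7);            // every frame ~16.7 ms
    std::vector<double> stuttery(100, 14.0);
    for (size_t i = 0; i < stuttery.size(); i += 10)  // periodic 40 ms spikes
        stuttery[i] = 40.0;

    std::printf("smooth:   %.0f fps avg, %.1f ms 99th percentile\n",
                AverageFps(smooth), PercentileMs(smooth, 99.0));
    std::printf("stuttery: %.0f fps avg, %.1f ms 99th percentile\n",
                AverageFps(stuttery), PercentileMs(stuttery, 99.0));
}
```

Both runs average roughly 60 FPS, but the second one spikes to 40 ms every tenth frame, and that is exactly the kind of microstutter the graph is showing.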

 


Great news. I'm really looking forward to the results of this acquisition. If the latency is greatly reduced even in multi-GPU configurations, I'll be very impressed.

From salty to bath salty in 2.9 seconds

 


Sounds interesting, but then it would mean FreeSync is only useful for preventing tearing.

 

To be fair, the 970 has massive stutter because of its weak 3.5+0.5GB VRAM, compared to the RX 480's 8GB. Games today simply use more than 4GB at the highest texture resolutions, so it will stutter heavily as it reads from system RAM.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


I'm looking forward to the fruits of this purchase; I can see it having a great effect on VR as well.

Pixelbook Go i5 Pixel 4 XL


YES! AMD is doing everything right at the moment, please keep it this way AMD!

System Specs:

CPU: Ryzen 7 5800X

GPU: Radeon RX 7900 XT 

RAM: 32GB 3600MHz

HDD: 1TB Sabrent NVMe -  WD 1TB Black - WD 2TB Green -  WD 4TB Blue

MB: Gigabyte  B550 Gaming X- RGB Disabled

PSU: Corsair RM850x 80 Plus Gold

Case: BeQuiet! Silent Base 801 Black

Cooler: Noctua NH-DH15

 

 

 


Huh. I'd be interested to see how heavily they market this tech, and what kind of impact it will have.

- snip-


HiAlgo doesn't have a particularly good reputation in the world of Skyrim modding. It works by lowering resolution when you're in motion (or panning the screen), then raising it again afterward, and a lot of people seem to feel that it really messes up your pretties. Also, it tends to have issues with ENBs - HiAlgo and ENB development seem to take turns breaking each other.

 

I recently tested it on a very heavily modded Skyrim installation (including an ENB) on my current, fairly high end rig (Skylake i7, 2x 290x in CF), and it DID seem to smooth framerate in some places, but it also caused some weird visual effects. The strengths didn't outweigh the weaknesses.

 

OTOH, a couple of years back (pre-ENB), I had some good experiences with it, like a week I spent at my mother-in-law's place: thanks to HiAlgo I was able to run my modded Skyrim install on a weak laptop that would otherwise NOT have been usable. And I have to say that I, personally, hardly noticed the graphical degradation - and REALLY noticed the steady FPS.

 

I'll be particularly interested to see how it does when implemented through drivers instead of as an add-on .dll.
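For the curious, the "lower resolution while the camera moves" behaviour I described boils down to a per-frame heuristic roughly like the sketch below. The thresholds and scale steps are made up for illustration; this is not HiAlgo's actual algorithm:

```cpp
// Rough sketch of a "lower resolution while the camera moves" heuristic like
// the one described above. Thresholds and scale factors are illustrative.
#include <algorithm>

struct RenderScale {
    double current = 1.0;   // 1.0 = full resolution, 0.5 = quarter pixel count
};

// camera_delta: how far the view moved/panned since the last frame (arbitrary units)
// frame_time_ms: how long the last frame took to render
void UpdateRenderScale(RenderScale& s, double camera_delta, double frame_time_ms) {
    const bool in_motion = camera_delta > 0.01;
    const bool struggling = frame_time_ms > 20.0;   // roughly below 50 fps

    if (in_motion && struggling) {
        // Motion hides the blur, so trade resolution for frame time.
        s.current = std::max(0.5, s.current - 0.05);
    } else if (!in_motion) {
        // Camera is still: restore full resolution so stills look sharp.
        s.current = std::min(1.0, s.current + 0.10);
    }
    // The renderer then draws to a (width * s.current) x (height * s.current)
    // target and upscales it to the back buffer.
}
```

That trade-off matches what I saw in practice: motion hides the lower resolution reasonably well, which is presumably also part of why it and ENB's post-processing keep tripping over each other.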

 

Spoiler

PSU: Cooler Master V1200 Platinum / MB: Asus ROG Strix X570-E Gaming / CPU: AMD Ryzen 7 3700x / RAM: G.Skill Trident Z Neo 32GB (2x16GB) / GPU: Gigabyte GeForce RTX 3090 Gaming OC 24GB / OS: Windows 11 / Screen: Samsung CRG9 (5120 x 1440) / Case: DIY Bench built custom into a a cabinet / Case Fans: 4x BeQuiet Magicool 140mm Pure Wings / Rad: Magicool 180 Triple / Pump: Aquastream XT / Res: Aquacomputer aqualis PRO 450ml / CPU Block: EK Supremacy Clear Acetal / GPU Blocks: Bykski N-GV1080TIG1-X with VRAM Cooling via B-3090TC-X Water Block


29 minutes ago, mikegray said:

HiAlgo doesn't have a particularly good reputation in the world of Skyrim modding. It works by lowering resolution when you're in motion (or panning the screen), then raising it again afterward, and a lot of people seem to feel that it really messes up your pretties. Also, it tends to have issues with ENBs - HiAlgo and ENB development seem to take turns breaking each other.

 

I recently tested it on a very heavily modded Skyrim installation (including an ENB) on my current, fairly high end rig (Skylake i7, 2x 290x in CF), and it DID seem to smooth framerate in some places, but it also caused some weird visual effects. The strengths didn't outweigh the weaknesses.

 

OTOH, a couple of years back (pre-ENB), I had some good experiences with it, like a week I spent at my mother-in-law's place: thanks to HiAlgo I was able to run my modded Skyrim install on a weak laptop that would otherwise NOT have been usable. And I have to say that I, personally, hardly noticed the graphical degradation - and REALLY noticed the steady FPS.

 

I'll be particularly interested to see how it does when implemented through drivers instead of as an add-on .dll.

ENBs and HiAlgo both use wrappers. The reason they break each other is that a wrapper is software sitting outside the API and driver. It can only apply things AFTER the fact.

 

What AMD/RTG intends to do is implement the tech at the driver level. This would offer MUCH more granular control than a simple overlay would.
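For the curious, both ENB and HiAlgo ship a stand-in d3d9.dll that the game loads instead of the real one, and the stand-in forwards everything to the genuine DLL. A minimal sketch of that forwarding trick (Windows-only, illustrative, not either mod's actual code; a real build also pins the undecorated export name with a .def file):

```cpp
// Minimal sketch of the proxy-DLL trick wrapper mods rely on: the game loads
// this stand-in d3d9.dll from its folder, and the stand-in forwards
// Direct3DCreate9 to the real system DLL.
#include <windows.h>

struct IDirect3D9;  // opaque: we only pass the pointer through

typedef IDirect3D9* (WINAPI* Direct3DCreate9Fn)(UINT);

extern "C" __declspec(dllexport) IDirect3D9* WINAPI Direct3DCreate9(UINT sdk_version) {
    // Load the genuine d3d9.dll from the system directory (or, in a chained
    // setup, whichever wrapper is "next" in line).
    char path[MAX_PATH];
    GetSystemDirectoryA(path, MAX_PATH);
    lstrcatA(path, "\\d3d9.dll");

    HMODULE real = LoadLibraryA(path);
    if (!real) return nullptr;

    auto real_create = reinterpret_cast<Direct3DCreate9Fn>(
        GetProcAddress(real, "Direct3DCreate9"));
    if (!real_create) return nullptr;

    IDirect3D9* d3d = real_create(sdk_version);
    // A real wrapper would return its own object wrapping `d3d` here, so it
    // can also wrap the device and hook Present() to inject per-frame logic.
    return d3d;
}
```

Since only one file next to the game executable can be named d3d9.dll, two wrappers have to chain-load each other, and an update to either one can break that chain - hence the conflicts. A driver sits below all of this, which is why AMD can be far more granular about it.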


Cool... Perhaps in the future this could make multi-GPU setups worth it again....

Make sure to quote me or tag me when responding to me, or I might not know you replied! Examples:

 

Do this:

Quote

And make sure you do it by hitting the quote button at the bottom left of my post, and not the one inside the editor!

Or this:

@DocSwag

 

Buy whatever product is best for you, not what product is "best" for the market.

 

Interested in computer architecture? Still in middle or high school? P.M. me!

 

I love computer hardware and feel free to ask me anything about that (or phones). I especially like SSDs. But please do not ask me anything about Networking, programming, command line stuff, or any relatively hard software stuff. I know next to nothing about that.

 

Compooters:

Spoiler

Desktop:

Spoiler

CPU: i7 6700k, CPU Cooler: be quiet! Dark Rock Pro 3, Motherboard: MSI Z170a KRAIT GAMING, RAM: G.Skill Ripjaws 4 Series 4x4gb DDR4-2666 MHz, Storage: SanDisk SSD Plus 240gb + OCZ Vertex 180 480 GB + Western Digital Caviar Blue 1 TB 7200 RPM, Video Card: EVGA GTX 970 SSC, Case: Fractal Design Define S, Power Supply: Seasonic Focus+ Gold 650w Yay, Keyboard: Logitech G710+, Mouse: Logitech G502 Proteus Spectrum, Headphones: B&O H9i, Monitor: LG 29um67 (2560x1080 75hz freesync)

Home Server:

Spoiler

CPU: Pentium G4400, CPU Cooler: Stock, Motherboard: MSI h110l Pro Mini AC, RAM: Hyper X Fury DDR4 1x8gb 2133 MHz, Storage: PNY CS1311 120gb SSD + two Segate 4tb HDDs in RAID 1, Video Card: Does Intel Integrated Graphics count?, Case: Fractal Design Node 304, Power Supply: Seasonic 360w 80+ Gold, Keyboard+Mouse+Monitor: Does it matter?

Laptop (I use it for school):

Spoiler

Surface book 2 13" with an i7 8650u, 8gb RAM, 256 GB storage, and a GTX 1050

And if you're curious (or a stalker) I have a Just Black Pixel 2 XL 64gb

 


3 hours ago, Notional said:

Sounds interesting, but then it would mean FreeSync is only useful for preventing tearing.

 

To be fair, the 970 has massive stutter because of its weak 3.5+0.5GB VRAM, compared to the RX 480's 8GB. Games today simply use more than 4GB at the highest texture resolutions, so it will stutter heavily as it reads from system RAM.

My GTX 760 stutters before it reaches its VRAM limit, so I don't agree with your theory.

Connection200mbps / 12mbps 5Ghz wifi

My baby: CPU - i7-4790, MB - Z97-A, RAM - Corsair Veng. LP 16gb, GPU - MSI GTX 1060, PSU - CXM 600, Storage - Evo 840 120gb, MX100 256gb, WD Blue 1TB, Cooler - Hyper Evo 212, Case - Corsair Carbide 200R, Monitor - Benq  XL2430T 144Hz, Mouse - FinalMouse, Keyboard -K70 RGB, OS - Win 10, Audio - DT990 Pro, Phone - iPhone SE


Does AMD/RTG really have the money to make acquisitions, even small ones?

Spoiler

My system is the Dell Inspiron 15 5559 Microsoft Signature Edition

                         The Austrailian king of LTT said that I'm awesome and a funny guy. the greatest psu list known to man DDR3 ram guide

                                                                                                               i got 477 posts in my first 30 days on LinusTechTips.com

 


Rise of the Tomb Raider is a horrible implementation of DX12. I'm surprised the 970's graph looks coherent at all.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


1 hour ago, Thony said:

My GTX 760 stutters before it reaches its VRAM limit, so I don't agree with your theory.

I didn't state that VRAM limitation is the only thing that causes stutter, but running out of VRAM will ALWAYS cause stutter, due to the huge latency of reading from system RAM instead of VRAM.
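To put rough numbers on it (approximate bandwidth figures, purely to illustrate the order of magnitude):

```cpp
// Back-of-the-envelope numbers (approximate, for illustration): why spilling
// textures into system RAM shows up as frame-time spikes.
#include <cstdio>

int main() {
    const double vram_gbps = 200.0;  // GDDR5 on a midrange card, roughly
    const double pcie_gbps = 16.0;   // PCIe 3.0 x16, roughly
    const double spill_mb  = 500.0;  // texture data that no longer fits in VRAM

    double vram_ms = spill_mb / 1024.0 / vram_gbps * 1000.0;
    double pcie_ms = spill_mb / 1024.0 / pcie_gbps * 1000.0;

    std::printf("Touching %.0f MB of textures: %.1f ms from VRAM, %.1f ms over PCIe\n",
                spill_mb, vram_ms, pcie_ms);
    std::printf("(a whole 60 Hz frame is only 16.7 ms, hence the stutter)\n");
}
```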

Case in point:

50 minutes ago, Briggsy said:

Rise of the Tomb Raider is a horrible implementation of DX12. I'm surprised the 970's graph looks coherent at all.

Indeed; however, DX12 in RotTR does seem to raise the minimum FPS, so you should generally get a more consistent framerate.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


1 hour ago, themaniac said:

Does AMD/RTG really have the money to make acquisitions, even small ones?

AMD is still a pretty big company.


1 minute ago, spartaman64 said:

AMD is still a pretty big company.

That is true, but AFAIK they are still losing money, so why are they buying companies, even small ones, when they can't even turn a profit?

Spoiler

My system is the Dell Inspiron 15 5559 Microsoft Signature Edition

                         The Austrailian king of LTT said that I'm awesome and a funny guy. the greatest psu list known to man DDR3 ram guide

                                                                                                               i got 477 posts in my first 30 days on LinusTechTips.com

 


4 minutes ago, themaniac said:

That is true, but AFAIK they are still losing money, so why are they buying companies, even small ones, when they can't even turn a profit?

If AMD doesn't innovate, there is no way for them to win back market share. Obviously that company has something AMD sees as valuable, and if AMD doesn't do things like this, they will be left behind.


4 hours ago, sof006 said:

YES! AMD is doing everything right at the moment, please keep it this way AMD!

Everything except complying with the motherboard PCIe specification:

 

https://www.reddit.com/r/Amd/comments/4qfwd4/rx480_fails_pcie_specification/

 

So yeah, not everything. Not by a long shot.

-------

Current Rig

-------


7 minutes ago, themaniac said:

That is true, but AFAIK they are still losing money, so why are they buying companies, even small ones, when they can't even turn a profit?

They're expecting higher sales soon.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


Even though the 480 ended up between the 970 and 980, this might make CF 480s a viable option to compete with 1070s.

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


8 minutes ago, don_svetlio said:

Even though the 480 ended up between the 970 and 980, this might make CF 480s a viable option to compete with 1070s.

I really want to know what you have been reading, but most reviews (video or not) report it performing slightly worse than the 970 in quite a number of theoretical and real-world scenarios.

Read the community standards; it's like a guide on how to not be a moron.

 

Gerdauf's Law: Each and every human being, without exception, is the direct carbon copy of the types of people that he/she bitterly opposes.

Remember, calling facts opinions does not ever make the facts opinions, no matter what nonsense you pull.


5 minutes ago, Colonel_Gerdauf said:

I really want to know what you have been reading, but most reviews (video or not) report it performing slightly worse than the 970 in quite a number of theoretical and real-world scenarios.

 

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


53 minutes ago, Colonel_Gerdauf said:

I really want to know what you have been reading, but most reviews (video or not) report it performing slightly worse than the 970 in quite a number of theoretical and real-world scenarios.

But then again, it's even faster than a 980 in "a number of theoretical and real-world scenarios".

Here is the thing: as always, the card launched with day-one drivers and the performance will increase over time, same as the Fury X, 330X, NV cards and every card ever. I could say "expect 20% better performance in future", but more realistically I'm hoping for 10-15%.

And even with current performance, all combined, it performs better than the 970 according to TechPowerUp and Guru3D. Those are the ones I can think of that take everything into account and do proper testing. I'm not taking LTT and other such mediocre reviews into account. You could say I'm cherry-picking, and it's absolutely correct that some sites like Tom's HW show it being slower than the 970 in many scenarios - as do the sites I've mentioned - but then again, as I've said, it's on par with and even faster than a 980 in some tests, and you can see that for yourself. All in all it's faster than a 970, but not that consistently, and not by a huge margin.

And I can't say it performs badly, because it doesn't... to a degree. The reference cooler is hands down horrible, and I've been saying that for a few weeks now. Other things like power delivery, OCing and temps are being improved with AIB cards that are showing an 8-pin connector (like the one from Sapphire; the fact that it draws more than advertised is still there, but the latest drivers have lowered, for instance, idle draw from 16W to 11W, so I'll bet something can be done there, even if by a small margin), OCs of 1480-1600 MHz according to HardOCP's chief editor, and we've seen 63°C on cards running at 1350 MHz - modded cards, not even proper AIB ones.

So it definitely should not be a bad card for the price, which is stupid at the moment and in most places way higher than it should be. I'll bet that AIB cards running at 1500 MHz with 8-pin connectors will be great. And who knows, those may even be on par with the 980, seeing how Guru3D got 1380 MHz out of their card and ~10% better performance. I personally don't care whether the power draw is 150W or 200W.

The ability to google properly is a skill of its own. 


5 minutes ago, Bouzoo said:

But then again, it's even faster than a 980 in "quite a number of theoretical and real-world scenarios".

Here is the thing: as always, the card launched with day-one drivers and the performance will increase over time, same as the Fury X, 330X, NV cards and every card ever. I could say "expect 20% better performance in future", but more realistically I'm hoping for 10-15%.

And even with current performance, all combined, it performs better than the 970 according to TechPowerUp and Guru3D. Those are the ones I can think of that take everything into account and do proper testing. I'm not taking LTT and other such mediocre reviews into account. You could say I'm cherry-picking, and it's absolutely correct that some sites like Tom's HW show it being slower than the 970 in many scenarios - as do the sites I've mentioned - but then again, as I've said, it's on par with and even faster than a 980 in some tests, and you can see that for yourself. All in all it's faster than a 970, but not that consistently.

And I can't say it performs badly, because it doesn't... to a degree. The reference cooler is hands down horrible, and I've been saying that for a few weeks now. Other things like power delivery, OCing and temps are being improved with AIB cards that are showing an 8-pin connector (like the one from Sapphire; the fact that it draws more than advertised is still there, but the latest drivers have lowered, for instance, idle draw from 16W to 11W, so I'll bet something can be done there, even if by a small margin), OCs of 1480-1600 MHz according to HardOCP's chief editor, and we've seen 63°C on cards running at 1350 MHz - modded cards, not even proper AIB ones.

So it definitely should not be a bad card for the price, which is stupid at the moment and in most places way higher than it should be. I'll bet that AIB cards running at 1500 MHz with 8-pin connectors will be great. And who knows, those may even be on par with the 980, seeing how Guru3D got 1380 MHz out of their card and ~10% better performance. I personally don't care whether the power draw is 150W or 200W.

Exactly - DigitalFoundry and Science Studio show it beating the 970, and in SS's case it ran at 78°C at reasonable fan levels. For some reason the engineering samples are wildly inconsistent.

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


1 minute ago, don_svetlio said:

Exactly - DigitalFoundry and Science Studio show it beating the 970, and in SS's case it ran at 78°C at reasonable fan levels. For some reason the engineering samples are wildly inconsistent.

I've seen that it runs at 80+°C under load, and I think that's more than realistic, since it's consistent with everything I've seen from sites like Guru3D, TPU and Tom's HW. And the noise is... well, nothing worse than a 290X or 980 Ti in their tests.

The ability to google properly is a skill of its own. 

