
RX 6800 Ray Tracing Performance Leaked

Solved by Random_Person1234:

UPDATE: Frank Azor (Chief Gaming Architect at AMD) promised that more details would be released about Big Navi ray tracing and super resolution before the launch of the cards.

https://videocardz.com/newz/amd-frank-azor-promises-more-details-on-radeon-rx-6000-ray-tracing-and-super-sampling-tech-before-launch

https://twitter.com/AzorFrank/status/1321808969283870721 

 

10 minutes ago, RejZoR said:

Technically, so is "RTX".

Technically speaking, yes. However - as is typical with Nvidia - they throw in their own proprietary tech that requires their hardware. Some games may work, some may be tricked into working, but I wouldn't be surprised to see games that won't run ray tracing on AMD cards at all (not without patching, at least) because of the Nvidia implementation.

 

Now, if we see games that only do RT on AMD cards and are blocked from running on Nvidia cards, I'll eat my hat.

10 minutes ago, RejZoR said:

NVIDIA didn't invent ray tracing

Oh, I know that.  Technically, ray-tracing has been around for decades, there just wasn't hardware to handle it real time on desktops until a few years ago.  I'm not even referring to RTX, that was just the first to brand itself around the tech.

30 minutes ago, Jito463 said:

Nothing special or unique to their hardware.

AMD does have "ray tracing accelerators" built into their RX6000 GPUs. I think there's supposed to be one per CU.

9 hours ago, Random_Person1234 said:

The AMD equivalent of DLSS is not out yet, according to rumors it's being worked on. So they were using native res.

 

They could use "Radeon Boost" though - not just on these cards, but on a number of previous generations too.

In the end, there may be many reasons why you'd prefer "not actually the full resolution" when you play a game, but for benchmarking and testing purposes, 1080p is 1080p and 4K is 4K. If you start to throw in DLSS, "Super Resolution", or whatever other trick to lower the load on the GPU, you may as well compare one card at 1440p with another at 1080p and call it a day...


I wonder if AMD will have a GOOD equivalent of DLSS. I've heard DLSS on the new 30-series cards is really good, so AMD has their work cut out for them; they probably can't afford to launch an equivalent of DLSS 1.0 (which IIRC wasn't very good).

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown

"A redline a day keeps depression at bay" - Author Unknown

Spoiler

Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 and 2 x Seagate ST2000DM006 (in RAID 0 for games!) - The good old Corsair GS700 - Yamakasi Catleap 2703 27" 1440p and ASUS VS239H-P 1080p 23" - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1

 

Avid Miata autocrosser :D

4 hours ago, Caroline said:

Oh no. Has AMD entered the memetracing competition too? welp, byebye to low prices I guess, $400 extra on a card just because it comes with that "feature", just like nvidia


There is no indication that AMD has specific hardware cores that handle ray tracing like Nvidia has their "RT Cores".

 

The most likely reason why they increased prices is because Nvidia did, and people are buying their cards anyway.

 

We'll know more details about RDNA 2.0 in the upcoming Architecture Day and they might reveal stuff about their Ray Tracing solution.

We have a NEW and GLORIOUSER-ER-ER PSU Tier List Now. (dammit @LukeSavenije stop coming up with new ones)

You can check out the old one that gave joy to so many across the land here

 

Computer having a hard time powering on? Troubleshoot it with this guide. (Currently looking for suggestions to update it into the context of <current year> and make it its own thread)

Computer Specs:

Spoiler

Mathresolvermajig: Intel Xeon E3 1240 (Sandy Bridge i7 equivalent)

Chillinmachine: Noctua NH-C14S
Framepainting-inator: EVGA GTX 1080 Ti SC2 Hybrid

Attachcorethingy: Gigabyte H61M-S2V-B3

Infoholdstick: Corsair 2x4GB DDR3 1333

Computerarmor: Silverstone RL06 "Lookalike"

Rememberdoogle: 1TB HDD + 120GB TR150 + 240 SSD Plus + 1TB MX500

AdditionalPylons: Phanteks AMP! 550W (based on Seasonic GX-550)

Letterpad: Rosewill Apollo 9100 (Cherry MX Red)

Buttonrodent: Razer Viper Mini + Huion H430P drawing Tablet

Auralnterface: Sennheiser HD 6xx

Liquidrectangles: LG 27UK850-W 4K HDR

 

22 minutes ago, Energycore said:

There is no indication that AMD has specific hardware cores that handle ray tracing like Nvidia has their "RT Cores".

 

The most likely reason why they increased prices is because Nvidia did, and people are buying their cards anyway.

 

We'll know more details about RDNA 2.0 in the upcoming Architecture Day and they might reveal stuff about their Ray Tracing solution.

https://hexus.net/tech/news/graphics/146497-amd-shares-rdna-2-hw-raytracing-performance-indicators/

There are Ray Accelerators (1 per CU) according to AMD.


For those of us who don't have ray tracing capable cards, is there a good way to see the differences between DLSS and native? And maybe RTX vs. good rasterization? I had the opportunity to run a 2080 Super through Port Royal a few times last month, and the DLSS runs had some nasty aliasing/pixelation that I noticed...

topics i need help on:

Spoiler

 

 

my "oops i bought intel right before zen 3 releases" build

CPU: Ryzen 5 3600 (placeholder)

GPU: Gigabyte 980ti Xtreme (also placeholder), deshroud w/ generic 1200rpm 120mm fans x2, stock bios 130% power, no voltage offset: +70 core +400 mem 

Memory: 2x16gb GSkill Trident Z RGB 3600C16, 14-15-30-288@1.45v

Motherboard: Asus ROG Strix X570-E Gaming

Cooler: Noctua NH-D15S w/ white chromax bling
OS Drive: Samsung PM981 1tb (OEM 970 Evo)

Storage Drive: XPG SX8200 Pro 2tb

Backup Storage: Seagate Barracuda Compute 4TB

PSU: Seasonic Prime Ultra Titanium 750W w/ black/white Cablemod extensions
Case: Fractal Design Meshify C Dark (to be replaced with a good case shortly)

basically everything was bought used off of reddit or here, the only new component was the case. absolutely nutty deals for some of these parts, I'll have to tally it all up once it's "done" :D 


7 hours ago, Random_Person1234 said:

but the 3070 beats the 6800 with DLSS enabled

Well yea, because that means it's rendering a completely different thing, a thing not as demanding. DLSS is cool though but it's still not a great way to compare the performance of GPUs, not in an analytical sense anyway. Comparing with DLSS is good when looking at performance value and purchase decision making, if a cheaper card can effectively get you the same thing you might as well spend less.

5 hours ago, Caroline said:

Oh no. Has AMD entered the memetracing competition too? welp, byebye to low prices I guess, $400 extra on a card just because it comes with that "feature", just like nvidia


This launch has increased raw performance across the board from both companies, so idk why people complain when they're also adding new features. The fact is that price-to-performance has gone up with the new generation, even excluding ray tracing performance. Once lower-end GPUs come out and supply improves, you'll be able to get a better GPU at the lower price range.

1 hour ago, Energycore said:

There is no indication that AMD has specific hardware cores that handle ray tracing like Nvidia has their "RT Cores".

So there's still a bit of hope

 

48 minutes ago, Brooksie359 said:

This launch has increased raw performance across the board from both companies so idk why people can complain when they also are adding new features as well? The fact is price to performance has gone up with new generation even when excluding ray tracing performance. Once lower gpus come out and we get better supply you will be able to get a better gpu at the lower price range. 

It's not just any feature, it's a pointless feature made for workstation graphics cards that has no use for regular users; it doesn't justify a $400 increase just because of that.

 

Nvidia is capable of manufacturing a graphics card without RT; they could sell much more for half the price, but of course that's not something they'd love to do.

7 hours ago, RejZoR said:

Thing with ray tracing is that it's relatively straight forward process. You can't wave a magic wand and make it faster. Calculating a ray bounce can't really be optimized beyond what anyone can do. You define amount of rays per pixel and amount of bounces with certain decline of light with every bounce. And to avoid using insane amounts of rays per pixel to get clean image, you use a denoiser. NVIDIA made big deal like their denoising is some magical thing done only by them.

As someone that's personally done research into this field (and spent many hours writing their own ray tracer): no, not really.

 

Ray tracing is a game of statistics. Each pixel's data has an intrinsic uncertainty, due to raytracing having a heavy reliance on random number generators. This uncertainty manifests in the final image as noise: the larger the uncertainty across the image, the noisier it gets. We've known this since the 90s and a large amount of ray tracing research since has been about how we can reduce this uncertainty.

 

And reducing uncertainty is, essentially, the same as making it faster in a time-limited scenario. If you can get the same uncertainty using fewer rays, you'll likely have a shorter overall render time.

 

One of the ways to reduce this uncertainty, as you mentioned, is to use large amounts of rays per pixel. But this isn't ideal - more rays means longer render times - and this technique rapidly falls against the brick wall of diminishing returns. The difference in uncertainty between a 100 rays per pixel (rpp) image and a 200 rpp image is far larger than that between a 200 and 300 rpp image, and eventually you'll get no improvement at all. Many commercial renderers try to reduce the impact of increased ray counts by only applying them where they're needed. Rather than requiring all pixels to use the same ray count, they instead target an uncertainty level for the pixels to reach. Easy pixels like plain, matte surfaces might reach that after 10 rays, but your caustics and reflections might need 100. No need to waste ray calculations on pixels that don't need it. I haven't heard of this technique being used in games (although I admit finding information about this is rather difficult as game companies are far less open about their craft than computer graphics companies) but I imagine that's partly due to the incredibly low ray count they're using anyway (1-2 rpp I believe is common in gaming).
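A minimal sketch of that adaptive idea, in Python. Everything here is hypothetical: the Gaussian `sample_pixel` is a stand-in for tracing a real ray, and the thresholds are made up. The point is just that the standard error shrinks as 1/sqrt(n), so easy pixels get to stop early:

```python
import random

random.seed(0)

def sample_pixel(true_value, spread):
    # Stand-in for tracing one ray: a noisy estimate of the pixel's radiance.
    return random.gauss(true_value, spread)

def render_pixel_adaptive(true_value, spread, target_sem, min_rays=10, max_rays=1000):
    # Keep firing rays until the standard error of the mean (our
    # "uncertainty") drops below target_sem, instead of using a
    # fixed ray count for every pixel.
    samples = []
    while len(samples) < max_rays:
        samples.append(sample_pixel(true_value, spread))
        n = len(samples)
        if n >= min_rays:
            mean = sum(samples) / n
            var = sum((s - mean) ** 2 for s in samples) / (n - 1)
            if (var / n) ** 0.5 < target_sem:  # uncertainty shrinks ~ 1/sqrt(n)
                break
    return sum(samples) / len(samples), len(samples)

# An "easy" matte pixel (low spread) converges with far fewer rays
# than a "hard" caustics pixel (high spread), which hits the ray budget.
_, easy_rays = render_pixel_adaptive(0.5, spread=0.05, target_sem=0.01)
_, hard_rays = render_pixel_adaptive(0.5, spread=0.50, target_sem=0.01)
```

Commercial renderers do something conceptually similar, just with real radiance estimates and smarter error metrics than a plain standard error.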

 

So what else can we do to improve uncertainty? Well some techniques don't apply to real time graphics, such as rendering multiple images and multiplying them together. Turns out the image obtained by merging two 100rpp images can often be less noisy than a single 200rpp image. But there are also many things we can do within the renderer itself.

 

One such way is through use of a Monte Carlo renderer. Named after the casinos and invented to design nuclear bombs, Monte Carlo methods are mathematical tools that use random numbers to find the answer to complex mathematical equations, just like the ones generated while doing ray tracing. By using a probability distribution function (PDF), one can change how rays reflect within the scene, so as to reduce the probability of generating ray paths that will not contribute to the final image (e.g. ones that will leave the scene and never hit anything) or to increase the probability of the opposite (e.g. more rays directed towards light sources), which is known as Importance Sampling. This is a tricky process but massively improves the quality of an image with only a small performance impact. As far as I'm aware, no games use this technique.
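A toy illustration of importance sampling, with the caveat that this is a 1-D integral rather than a real renderer, and the PDF proportional to exp(-4x) is a made-up approximation to the integrand, the way a real PDF only approximates the BRDF:

```python
import math
import random

random.seed(1)

def estimate_uniform(f, n):
    # Plain Monte Carlo: average f at uniform random points on [0, 1].
    return sum(f(random.random()) for _ in range(n)) / n

def estimate_importance(f, pdf, draw, n):
    # Importance sampling: draw x from a PDF shaped roughly like f and
    # weight each sample by 1/pdf(x), which keeps the estimate unbiased
    # while concentrating samples where f actually contributes.
    return sum(f(x) / pdf(x) for x in (draw() for _ in range(n))) / n

# Integrand sharply peaked near 0, loosely like a glossy BRDF lobe.
f = lambda x: math.exp(-8 * x)

# Hypothetical importance PDF proportional to exp(-4x) on [0, 1]: only
# an approximation to f, sampled by inverting its CDF.
norm = (1 - math.exp(-4)) / 4
pdf = lambda x: math.exp(-4 * x) / norm
draw = lambda: -math.log(1 - random.random() * (1 - math.exp(-4))) / 4

est_u = estimate_uniform(f, 2000)
est_i = estimate_importance(f, pdf, draw, 2000)
# Both estimate the true integral (1 - e^-8) / 8, roughly 0.125; the
# importance-sampled estimate has substantially lower variance here.
```

Same ray budget, same unbiased answer, less noise - which is exactly the trade a renderer wants.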

 

Another such technique is path tracing - Quake II RTX uses this (but renders the whole scene with it rather than just shadows and lighting, which massively changes performance) - which requires changes to the raytracing algorithm that slow it down slightly. In return for these changes, a lot of effects that you would otherwise have to code in manually are suddenly done automatically, e.g. soft shadows, caustics, ambient occlusion and indirect lighting. But this is just the tip of the iceberg when it comes to algorithmic tweaks you can make. Light tracing, for example, reverses the raytracing algorithm and instead shoots rays from light sources rather than the camera. Bidirectional path tracing combines both - shooting rays from both lights and cameras, which eventually meet mid-scene. Each completed ray traversal may be more difficult with these techniques, but the uncertainty of that traversal is lower as well. As far as I'm aware, no games use any of these.

 

We can also make efficiencies regarding how the rays traverse the scene. DirectX Raytracing uses a Bounding Volume Hierarchy (BVH) - which essentially encapsulates the scene in a complex series of bounding boxes, kinda like a Russian doll. If a ray doesn't hit a big box, then it definitely won't hit any of the boxes inside it, meaning you don't need to bother calculating intersection tests for them. This is a common technique and pretty easy to code, but it's a far cry from being the only thing we can do to improve performance. For example, Metropolis Light Transport caches 'nodes' along a ray path, which can be reused later to massively reduce the amount of computation along common ray paths.
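To make the "Russian doll" pruning concrete, here's a sketch of the standard ray-vs-AABB slab test with a hypothetical two-level hierarchy (real BVHs are trees of such boxes, but the skip logic is the same):

```python
def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    # Slab test: a ray hits an axis-aligned box iff the parameter
    # intervals where it lies between each pair of parallel planes
    # all overlap.
    tmin, tmax = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1 = (lo - o) * inv
        t2 = (hi - o) * inv
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

# Hypothetical scene: one outer box containing two inner boxes. If the
# ray misses the outer box, the inner boxes are never tested at all.
outer = ((0.0, 0.0, 0.0), (10.0, 10.0, 10.0))
inner_boxes = [((1, 1, 1), (2, 2, 2)), ((7, 7, 7), (9, 9, 9))]

def trace(origin, direction):
    inv_dir = tuple(1.0 / d if d != 0 else float("inf") for d in direction)
    if not ray_hits_aabb(origin, inv_dir, *outer):
        return []  # pruned: zero inner intersection tests performed
    return [b for b in inner_boxes if ray_hits_aabb(origin, inv_dir, *b)]

hits = trace((-1.0, 1.5, 1.5), (1.0, 0.0, 0.0))    # passes through inner box 1
misses = trace((-1.0, 20.0, 1.5), (1.0, 0.0, 0.0))  # flies over the whole scene
```

RT cores accelerate exactly this kind of traversal in hardware, which is why the rest of the algorithmic tricks above still matter on either vendor's cards.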

 

All of the above is about CPU raytracing, and all of it can be applied to GPU ray tracing as well. All RT cores do is accelerate the process of navigating a BVH, which is a tiny fraction of the code that's involved in writing a ray tracer. The rest is all still handled by the CPU or in an HLSL shader, and as such can benefit from the techniques I've described above. And this is all before you even think about applying a denoiser to your output.

 

So no, saying there's nothing that can be done to improve the raytracing algorithm is very much incorrect. There are many, many ways of doing this, a lot of which Nvidia invented themselves - they've been publishing a lot of papers about raytracing for years, from long before RTX.

 

My guess as to why games aren't using most of these techniques? They take time to implement well and the companies can't justify spending that time on something that barely anyone is using. The return on investment just isn't there vs a simple solution that they can use to slap 'Ray Tracing Support' on their list of advertising points.

 

(Sorry for the wall of text)

Quote

Bottom line is, results are as I've expected them. I was expecting basically the same performance and NVIDIA taking the edge in DLSS supported titles. We'll have to see what's AMD's Super Resolution feature and if it works similar to DLSS in any way. Then we can expect similar outcome on AMD's side. I just hope AMD's implementation works in all games, that would be nice. Can't think of anyone else it would be. Coz Virtual Super Resolution is already rendering at higher resolution and outputting on lower resolution display. Super Resolution can only be something similar to DLSS. We'll see what it is. Hopefully soon.

As much as I want AMD's solution to be amazing, we have to remember that DLSS leverages the tensor cores available on RTX GPUs. The RX 6000 series has no such dedicated hardware, so I wouldn't be surprised if the effectiveness of their solution is hampered by this. It will have to run on the GPU's general-purpose hardware, reducing how much headroom is gained from the technique. Until AMD has hardware that can accelerate their version of DLSS, I doubt it will truly be able to compete with Nvidia's solution.

My PCs:

Quote

Timothy: 

i7 4790k

16GB Corsair Vengeance DDR3

ASUS GTX 1060 6GB

Corsair Carbide 300R

 

3 hours ago, Energycore said:

There is no indication that AMD has specific hardware cores that handle ray tracing like Nvidia has their "RT Cores".

 

The most likely reason why they increased prices is because Nvidia did, and people are buying their cards anyway.

 

We'll know more details about RDNA 2.0 in the upcoming Architecture Day and they might reveal stuff about their Ray Tracing solution.

??? Wrong. AMD have dedicated hardware in each CU just like NVIDIA.


Not bad, considering we're looking at the 6800, which has 60 CUs vs. the 80 CUs of the 6900 XT and the 72 CUs of the 6800 XT - these cards have one ray accelerator per CU.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 

17 minutes ago, schwellmo92 said:

??? Wrong. AMD have dedicated hardware in each CU just like NVIDIA.

Yeah, don't know where people are getting that from. What else is "Ray Accelerators" supposed to mean? 

System Specs

  • CPU
    AMD Ryzen 7 3700X
  • Motherboard
    Gigabyte AMD X570 Auros Master
  • RAM
    G.Skill Ripjaws 32 GBs
  • GPU
    AMD RX 470
  • Case
    Corsair 570X
  • Storage
Samsung SSD 860 QVO 2TB - HDD Seagate Barracuda 1TB - External Seagate HDD 8TB
  • PSU
    G.Skill RipJaws 1250 Watts
  • Keyboard
    Corsair Gaming Keyboard K55
  • Mouse
    Razer Naga Trinity
  • Operating System
    Windows 10
2 hours ago, Caroline said:

So there's still a bit of hope

 

It's not any feature, it's a pointless feature made for workstation graphics cards that have no use for regular users, it doesn't justifies a $400 increase just because of that.

 

nvidia is capable of manufacturing a graphics card without RT, they could sell much more for half of the price but of course that's not something they'd love to do

That it isn't a useful feature is purely your perspective. There are quite a few people who enjoy the benefits of the added hardware, and just because you don't doesn't mean others won't. And do you honestly believe they would sell the current 3080 for 400 dollars less if it didn't have the raytracing feature? Fat chance on that one, and the same goes for AMD with their cards. If you don't care about new features and want to save money, then buy used - but don't complain about new features just because you don't care about them.

38 minutes ago, BlackManINC said:

Yeah, don't know where people are getting that from. What else is "Ray Accelerators" supposed to mean? 

It has been detailed by both AMD and Microsoft.

 

amd-ray-accelerator.png

Microsoft provides more details Xbox Series X architecture | VideoCardz.com

 

 

56 minutes ago, schwellmo92 said:

??? Wrong. AMD have dedicated hardware in each CU just like NVIDIA.

Must have missed that. They do indeed have hardware Ray Tracing. I'm not sure whether that's a higher or lower percentage of the die area than Nvidia, that would be interesting to know.

1 minute ago, Energycore said:

Must have missed that. They do indeed have hardware Ray Tracing. I'm not sure whether that's a higher or lower percentage of the die area than Nvidia, that would be interesting to know.

Those are what they call "Ray Accelerators" on the official spec pages right? 

2 minutes ago, BlackManINC said:

Those are what they call "Ray Accelerators" on the official spec pages right? 

Right. I don't know the details, but you can clearly see that a part of the silicon was dedicated to raytracing performance.

1 minute ago, Energycore said:

Right. I don't know details but you can clearly see that a part of the silicon was dedicated to raytracing performance

Yeah, so I guess good things are coming ahead of us when it comes to performance, as games become better optimized for it. 


The big question seems to be: how good is DLSS 2 really in gaming? I've heard it looks great in stills. I've also heard it's not so great when things are moving, though. Stuff will come out and be tested head to head. We shall see what we shall see. If the 6800 can't provide an equal game experience to a 3070, the $50 price undercut may not be enough.

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

6 hours ago, Energycore said:

There is no indication that AMD has specific hardware cores that handle ray tracing like Nvidia has their "RT Cores".

 

The most likely reason why they increased prices is because Nvidia did, and people are buying their cards anyway.

 

We'll know more details about RDNA 2.0 in the upcoming Architecture Day and they might reveal stuff about their Ray Tracing solution.

Is there a date on architecture day? Can't seem to find anything about it.

AMD Ryzen 7 5800X | ASUS Strix X570-E | G.Skill 32GB 3600MHz CL16 | PALIT RTX 3080 10GB GamingPro | Samsung 850 Pro 2TB | Seagate Barracuda 8TB | Sound Blaster AE-9 MUSES

7 hours ago, Energycore said:

There is no indication that AMD has specific hardware cores that handle ray tracing like Nvidia has their "RT Cores".

 

The most likely reason why they increased prices is because Nvidia did, and people are buying their cards anyway.

 

We'll know more details about RDNA 2.0 in the upcoming Architecture Day and they might reveal stuff about their Ray Tracing solution.

This turns out not to be quite the case. Instead of a dedicated section of the die, they apparently built it into every single CU. I suppose it's possible they're flat out lying because they know it can't be checked - wouldn't be the first time a marketing department did that. Not sure how well this will work, but it's how they claim they did it. They've got a DLSS variant as well, though it apparently differs from DLSS a lot more than their in-CU ray tracing hardware differs from RT cores. Which of course means nothing if it isn't in the software.

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

