
3080's 10GB VRAM: Is it enough?

Consul

As a person who is thinking about buying a 3080 at some point, I'd like to ask a question.

I've been seeing most people say that 10GB of VRAM is not enough for the near future (2021-2022).

I currently have a 1080p monitor; I'll be using the 3080 with that first, and when I get the money I might upgrade to a 1440p 144Hz monitor.

 

So the questions are: is the 10GB of VRAM in the 3080 enough? And does GDDR6X make up for the shortfall compared to the 2080 Ti's 11GB of VRAM?

 


10GB is overkill.

Most modern GPUs have 6GB or 8GB.

I have 6GB and it's more than enough for 1440p.

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

Truth be told, nobody knows. As of this moment, MS Flight Sim will max out all the VRAM you can throw at it at high resolutions, but that is the exception rather than the rule. The vast majority of games are perfectly fine with 8GB or less right now.

 

The main cause for concern is that both next-gen Xbox and PlayStation consoles are shipping with 16GB of VRAM, and the broad assumption is that when you give developers a resource like that to play with, they will find a way to use it, and subsequently cross-platform titles will develop an appetite for more VRAM than is commonly available. But it is impossible to say until it happens.

Current build: AMD Ryzen 7 5800X, ASUS PRIME X570-Pro, EVGA RTX 3080 XC3 Ultra, G.Skill 2x16GB 3600C16 DDR4, Samsung 980 Pro 1TB, Sabrent Rocket 1TB, Corsair RM750x, Scythe Mugen 5 Rev. B, Phanteks Enthoo Pro M, LG 27GL83A-B


Just wait: there will be a 20GB 3080 and a 16GB 3070, so you'll have more choices,

and AMD will also have other GPUs,

so wait until early November.


Yes, it is enough for 1440p gaming. For how long, I don't know; that depends on the game. I wouldn't worry about it though, because current AAA games use around 7GB of VRAM at 1440p.


Just now, FaxedForward said:

The main cause for concern is that both next-gen Xbox and PlayStation consoles are shipping with 16GB of VRAM

That's a misunderstanding.

In next-gen consoles the CPU will be using 8GB of that memory as system RAM, and the GPU will have 8GB available to it.

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

There was a video recently (was it GN or Jayz?) where they talked about how, most of the time, VRAM that looks "maxed out" is really just fully allocated.

I could use some help with this!

Please PM me if you would like to contribute to my GPU BIOS database (it includes overclocking BIOSes, stock BIOSes, and upgrades to GPUs via modding).

Bios database

My beautiful, but not that powerful, main PC:

prior build:



17 minutes ago, FaxedForward said:

The main cause for concern is that both next-gen Xbox and PlayStation consoles are shipping with 16GB of VRAM

That's 16GB of total system RAM (the consoles use GDDR6 for everything, instead of DDR4 for system RAM like a PC), with half dedicated to the GPU, much the same way integrated graphics work on PCs. That means 8GB of RAM dedicated to the GPU and 8GB dedicated to the system. This is a thoroughly average RAM arrangement for low-to-mid-range gaming PCs currently on the market.

CPURyzen 7 5800X Cooler: Arctic Liquid Freezer II 120mm AIO with push-pull Arctic P12 PWM fans RAM: G.Skill Ripjaws V 4x8GB 3600 16-16-16-30

MotherboardASRock X570M Pro4 GPUASRock RX 5700 XT Reference with Eiswolf GPX-Pro 240 AIO Case: Antec P5 PSU: Rosewill Capstone 750M

Monitor: ASUS ROG Strix XG32VC Case Fans: 2x Arctic P12 PWM Storage: HP EX950 1TB NVMe, Mushkin Pilot-E 1TB NVMe, 2x Constellation ES 2TB in RAID1

https://hwbot.org/submission/4497882_btgbullseye_gpupi_v3.3___32b_radeon_rx_5700_xt_13min_37sec_848ms


Just now, BTGbullseye said:

That's 16GB of total system RAM, with half dedicated to the GPU, much the same way integrated graphics work on PCs. That means 8GB of RAM dedicated to the GPU and 8GB dedicated to the system. This is a thoroughly average RAM arrangement for low-to-mid-range gaming PCs currently on the market.

Not quite the same way: usually PC iGPU memory is not split evenly. IIRC, on a 4GB laptop it's around 1GB dedicated to the iGPU.

I could use some help with this!

Please PM me if you would like to contribute to my GPU BIOS database (it includes overclocking BIOSes, stock BIOSes, and upgrades to GPUs via modding).

Bios database

My beautiful, but not that powerful, main PC:

prior build:



21 minutes ago, HelpfulTechWizard said:

There was a video recently (was it GN or Jayz?) where they talked about how, most of the time, VRAM that looks "maxed out" is really just fully allocated.

It was a GN video. I don't remember which one exactly, but if I remember correctly the general point was that the system will show all the memory as allocated, which might be true, but just because it is allocated does not mean it is actually being actively used.

 

Edit: Found the video; it was the 3080 FE review linked below. The part about the VRAM is at 2:58-4:28.
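The allocated-vs-used distinction has a close analogue in ordinary virtual memory, and it's easy to demonstrate: reserving address space costs essentially nothing until pages are actually touched. A small Python sketch of the same bookkeeping idea (this is system RAM, not GPU code, and the sizes are arbitrary):

```python
import mmap

# Reserve a large anonymous mapping: this is "allocation".
# With demand paging, the OS hands out address space but commits
# no physical pages until they are actually written.
SIZE = 256 * 1024 * 1024  # 256 MiB reserved
buf = mmap.mmap(-1, SIZE)

# Actively use only the first 1 MiB; only the pages we touch
# get backed by real memory.
touched = 1024 * 1024
buf[:touched] = b"\xab" * touched

# A naive monitor reports the full 256 MiB mapping as "in use",
# even though the working set is about 1 MiB.
assert len(buf) == SIZE
assert buf[:4] == b"\xab\xab\xab\xab"
assert buf[SIZE - 4:] == b"\x00\x00\x00\x00"  # untouched pages read as zeros
buf.close()
```

GPU drivers and game engines over-allocate in the same spirit (caching textures they might need), which is why an "X GB used" readout tends to overstate what a game actually requires.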

 

Edited by The_russian

37 minutes ago, HelpfulTechWizard said:

Not quite the same way: usually PC iGPU memory is not split evenly. IIRC, on a 4GB laptop it's around 1GB dedicated to the iGPU.

It's dynamic: it automatically adjusts based on allocation, and it defaults to an 8/8 split. This is identical to how iGPUs on PCs work (PCs automatically split it half and half if you have 16GB or more RAM).

CPURyzen 7 5800X Cooler: Arctic Liquid Freezer II 120mm AIO with push-pull Arctic P12 PWM fans RAM: G.Skill Ripjaws V 4x8GB 3600 16-16-16-30

MotherboardASRock X570M Pro4 GPUASRock RX 5700 XT Reference with Eiswolf GPX-Pro 240 AIO Case: Antec P5 PSU: Rosewill Capstone 750M

Monitor: ASUS ROG Strix XG32VC Case Fans: 2x Arctic P12 PWM Storage: HP EX950 1TB NVMe, Mushkin Pilot-E 1TB NVMe, 2x Constellation ES 2TB in RAID1

https://hwbot.org/submission/4497882_btgbullseye_gpupi_v3.3___32b_radeon_rx_5700_xt_13min_37sec_848ms


1 hour ago, Shimejii said:

10GB is plenty. You'll be fine for the next 3-4 years; those who say otherwise need to get rid of their crystal balls. Games will run with what they are given, and 10GB is plenty for them.

Sure they will, but will they run fast enough to play? An ultralight laptop U-class iGPU will RUN a triple-A game; the problem is that 12fps isn't usable. There are several reasons, but the one that matters most for memory size is swap.

A swap file is hard drive space used as memory. Hard drives used to BE RAM: there was this stuff called drum memory that was basically a barrel-shaped hard drive, and the whole sector/track scheme comes directly from that system. A modern computer has as much memory as it needs because it digs into slow storage to get it; you don't see many "out of memory" errors these days. The problem with swap is that it runs at storage speeds, which are orders of magnitude slower than DDR(number). Any swap at all and you've got big problems; lots of swap can be fatal. Games run on buffered frames, which creates hard time limits.

 

Back in college I used to dig into swap all the time. I liked to tile gigantic pictures in Quark on PowerPC Macs; the files were too big for RAM. The things would take days to print, but they could do it.
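The RAM-versus-storage gap described above is easy to measure yourself. A rough Python sketch (the timings are machine-dependent, and fast NVMe narrows the gap, but it stays large; buffer size and file location are arbitrary choices):

```python
import os
import tempfile
import time

SIZE = 64 * 1024 * 1024  # 64 MiB test buffer
data = os.urandom(SIZE)

# In-RAM copy: limited by memory bandwidth only.
t0 = time.perf_counter()
copy = bytes(data)
ram_s = time.perf_counter() - t0

# Round trip through storage, fsync'd so it really hits the disk
# rather than sitting in the OS page cache.
with tempfile.NamedTemporaryFile(delete=False) as f:
    path = f.name
    t0 = time.perf_counter()
    f.write(data)
    f.flush()
    os.fsync(f.fileno())
    disk_s = time.perf_counter() - t0

with open(path, "rb") as f:
    back = f.read()
os.unlink(path)

print(f"RAM copy  : {ram_s * 1e3:8.1f} ms")
print(f"Disk write: {disk_s * 1e3:8.1f} ms")
assert copy == data and back == data
```

The same asymmetry is why a game that spills VRAM into system RAM (or worse, into swap) stutters instead of merely slowing down a little.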

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


1 hour ago, Vishera said:

That's a misunderstanding.

In next-gen consoles the CPU will be using 8GB of that memory as system RAM, and the GPU will have 8GB available to it.

That clears everything up if that's the case in next-gen consoles.

 

1 hour ago, YsGrandi said:

Just wait: there will be a 20GB 3080 and a 16GB 3070, so you'll have more choices,

and AMD will also have other GPUs,

so wait until early November.

That is true; however, there is speculation about a big price difference between the 10GB model and the 20GB model, and I don't think I will really need a 20GB GPU.


3 hours ago, Consul said:


 

It's "enough", though there's a solid case that the 3080 was cut down to leave room for a card between the 3080 and 3090 later, for those who wait.

 

Games that crash on the 2080 Ti due to not enough VRAM: RE3R with maxed-out settings at 4K, BL3 when there are enough guns next to the safe, GTA V with certain texture mods above 1080p.

 

Games that actually max out VRAM but do not crash: Flight Sim, Zero Dawn, BFV on certain maps.

 

It's still pretty fringe, but definitely not future-proof.

 

A 3080 for 700 USD is a good deal; don't let the VRAM thing stop you, just know that it's cutting it close.

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


  • 3 months later...
On 9/21/2020 at 3:55 AM, Vishera said:

10GB is overkill.

Most modern GPUs have 6GB or 8GB.

I have 6GB and it's more than enough for 1440p.

100% not enough.

Yes, it may be enough for now at 1440p, but not after next gen. Some games already use more than 10GB of VRAM at 1440p; imagine a few years from now.

Do you remember five years ago, when people said "4GB is more than enough and 6GB is overkill"? Look at us now.

Imagine how much VRAM games will use five years from now (my guess: around 16GB).

Even the AMD cards already have 16GB of VRAM, but if you're planning on upgrading then it's no problem at all.

 

Currently using an EVGA FTW3 Ultra RTX 3080 + Ryzen 7 5800X


1 hour ago, Nick Gurr said:

100% Not enough.

This. 

 

I have games that are two years old that I can't max out because they would use too much VRAM... 

 

Funny thing: I've just started playing RE3 Remake, and that uses "8GB" for textures alone...

 

so I thought "hmm I'm gonna try anyway!" 

 

So with textures set to 8GB and everything else on high or medium, it uses just south of 12GB or so...

 

ok, apply the settings ---> insta crash lol

 

 

And it's repeatable, too.

 

(also the game warns you that this could happen if you go over the limit) 

 

 

Now, I do believe that *this* is also an example of bad optimization, but I have other games where that isn't as apparent and you can actually see the difference with "higher quality textures", except the game will lag and glitch out if you go over the limit, though usually not outright crash.

 

So yeah, the "allocation" argument doesn't really hold true at all, because the difference between "4GB" textures and "8GB" textures is clearly visible, and therefore it isn't rocket surgery to figure out that higher-resolution textures do indeed use more VRAM.

 

 

 

Of course, it's also true that many games still use low-quality textures nowadays, so people aren't necessarily running into these issues.

 

¯\_(ツ)_/¯

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


On 9/20/2020 at 9:51 PM, Consul said:

As a person who is thinking about buying a 3080 at some point, I'd like to ask a question.

I've been seeing most people say that 10GB of VRAM is not enough for the near future (2021-2022).

I currently have a 1080p monitor; I'll be using the 3080 with that first, and when I get the money I might upgrade to a 1440p 144Hz monitor.

 

So the questions are: is the 10GB of VRAM in the 3080 enough? And does GDDR6X make up for the shortfall compared to the 2080 Ti's 11GB of VRAM?

 

There are only a few games right now that can hit 10GB or exceed it. With that being said, if you want to use this card at 3440x1440 or 4K for the next 5 years for the latest and greatest AAA titles, then yeah, that will become a problem. If you are playing The Sims or something basic like Fortnite, then 10GB should be more than enough, even at 4K.

SFF Time N-ATX V2 - Gigabyte X570 I Aorus Pro WIFI - AMD Ryzen 9 5800X3D - Gigabyte Gaming OC RTX 4090 - LG C2 OLED 42" 

 

 

 

 


I feel like I just had this discussion on another forum, so I'll cut out all the gunk that spawned from that discussion and move to the strong points.
 

  • For the moment, 10GB on the RTX 3080 is enough for 4K, but not higher resolutions than that.
  • 16GB for the RX 6000 cards is overkill.
    • 256-bit GDDR6 memory = slower and unable to take advantage of 4K in most situations.
    • Thus, it is nothing more than a marketing point for the foreseeable future.
  • The most "intense" games currently on the market use just over 9GB VRAM at 4K. 
    • Cyberpunk 2077, when maxed out at 4K without DLSS, utilizes around 9GB VRAM.
      • At these settings, the average framerate is around 5 FPS.
    • Doom Eternal, when maxed out at 4K, utilizes around 8.5GB VRAM
      • Most of this is cached and it will likely scale with lower VRAM (going to test this on my 2070 Super later today)
    • Grand Theft Auto 5, at 4K max settings with Frame Scaling mode set to 3/2 (1.5x the resolution of 4K), utilizes roughly 9.8GB VRAM.
      • around 35 FPS average.
      • Playing GTA V at 4K on a 3080 without utilizing Frame Scaling mode yields 60+ FPS.
  • DLSS and other AI upscaling tech will help to mitigate this as it renders at a lower resolution, thus requiring less VRAM.

In several years, the 10GB of VRAM on the 3080 "might" become an issue, but as has been pointed out already, the new consoles have 16GB of shared memory that is used for the system, OS, and video. Both the PS5 and Xbox Series X/S have quick-start features that allow you to switch between games/applications quickly; keeping those in place will also cost a decent chunk of memory.
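The frame-scaling point above is just pixel arithmetic: 3/2 scaling multiplies the pixel count by 2.25x, and anything resolution-sized in VRAM scales with it. A back-of-envelope sketch (the buffer count and bytes-per-pixel are assumptions for illustration; real VRAM use is dominated by textures, not render targets):

```python
def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=3):
    """Rough VRAM cost of the render targets / swap chain alone,
    ignoring textures, geometry, and driver overhead."""
    return width * height * bytes_per_pixel * buffers / 2**20

w, h = 3840, 2160                    # 4K native
sw, sh = int(w * 1.5), int(h * 1.5)  # 3/2 frame scaling -> 5760x3240

print(f"4K native : {framebuffer_mib(w, h):6.1f} MiB")    # ~95 MiB
print(f"3/2 scaled: {framebuffer_mib(sw, sh):6.1f} MiB")  # ~214 MiB

# 1.5x in each dimension means 2.25x the pixels, and 2.25x the cost.
assert abs(framebuffer_mib(sw, sh) / framebuffer_mib(w, h) - 2.25) < 1e-9
```

The same scaling logic explains the DLSS bullet: rendering internally at a lower resolution shrinks every resolution-sized buffer by the square of the linear scale factor.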

 

TL;DR

10GB of VRAM is fine at 4K for now. 8GB might not be enough at 4K but should be plenty at 1440P. 16GB VRAM is complete overkill considering that the RX 6000 cards don't scale very well beyond 1440P.

CPU: Ryzen 7 5800x3D || GPU: Gigabyte Windforce RTX 4090 || Memory: 32GB Corsair 3200mhz DDR4 || Motherboard: MSI B450 Tomahawk || SSD1: 500 GB Samsung 850 EVO M.2 (OS drive) || SSD2: 500 GB Samsung 860 EVO SATA (Cache Drive via PrimoCache) || Spinning Disks: 3 x 4TB Western Digital Blue HDD (RAID 0) || Monitor: LG CX 55" OLED TV || Sound: Schiit Stack (Modi 2/Magni 3) - Sennheiser HD 598, HiFiMan HE 400i || Keyboard: Logitech G915 TKL || Mouse: Logitech G502 Lightspeed || PSU: EVGA 1300-watt G+ PSU || Case: Fractal Design Pop XL Air
 


On 1/10/2021 at 5:46 PM, Nick Gurr said:

100% not enough.

Yes, it may be enough for now at 1440p, but not after next gen. Some games already use more than 10GB of VRAM at 1440p; imagine a few years from now.

Do you remember five years ago, when people said "4GB is more than enough and 6GB is overkill"? Look at us now.

Imagine how much VRAM games will use five years from now (my guess: around 16GB).

Even the AMD cards already have 16GB of VRAM, but if you're planning on upgrading then it's no problem at all.

 

Currently using a EVGA FTW3 Ultra RTX 3080+Ryzen 7 5800x

This is true in many, many areas of computer tech. Moore's law effectively documented the progression of the wavefront of a technology explosion. Even if it no longer accurately does that, I don't think the fuel has been used up yet. Progression may not be as fast as it once was, but that doesn't mean it's slow.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


9 hours ago, MadPistol said:

I feel like I just had this discussion on another forum, so I'll cut out all the gunk that spawned from that discussion and move to the strong points.
 

  • For the moment, 10GB on the RTX 3080 is enough for 4K, but not higher resolutions than that.
  • 16GB for the RX 6000 cards is overkill.
    • 256-bit GDDR6 memory = slower and unable to take advantage of 4K in most situations.
    • Thus, it is nothing more than a marketing point for the foreseeable future.
  • The most "intense" games currently on the market use just over 9GB VRAM at 4K. 
    • Cyberpunk 2077, when maxed out at 4K without DLSS, utilizes around 9GB VRAM.
      • At these settings, the average framerate is around 5 FPS.
    • Doom Eternal, when maxed out at 4K, utilizes around 8.5GB VRAM
      • Most of this is cached and it will likely scale with lower VRAM (going to test this on my 2070 Super later today)
    • Grand Theft Auto 5, at 4K max settings with Frame Scaling mode set to 3/2 (1.5x the resolution of 4K), utilizes roughly 9.8GB VRAM.
      • around 35 FPS average.
      • Playing GTA V at 4K on a 3080 without utilizing Frame Scaling mode yields 60+ FPS.
  • DLSS and other AI upscaling tech will help to mitigate this as it renders at a lower resolution, thus requiring less VRAM.

In several years, the 10GB of VRAM on the 3080 "might" become an issue, but as has been pointed out already, the new consoles have 16GB of shared memory that is used for the system, OS, and video. Both the PS5 and Xbox Series X/S have quick-start features that allow you to switch between games/applications quickly; keeping those in place will also cost a decent chunk of memory.

 

TL;DR

10GB of VRAM is fine at 4K for now. 8GB might not be enough at 4K but should be plenty at 1440P. 16GB VRAM is complete overkill considering that the RX 6000 cards don't scale very well beyond 1440P.

Time to play "can you guess which thing is not like the others" with this. Most of it I agree with; there are a few points I think are less demonstrable. One is not so much that the 16GB on the AMD cards is overkill, so much as that it produces a card with a longer working life. Historically this has been the case. The 580 is also a card with "too much memory". The thing about the 580, though, is that not all applications hit the same parts of the card equally hard. The 580 was a slower card than the 980, but it can run things the 980 can't. They're still selling 580s; the 980 is long dead.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


On 9/20/2020 at 9:56 PM, FaxedForward said:

The main cause for concern is that both next-gen Xbox and PlayStation consoles are shipping with 16GB of VRAM,

Steve from GN's video on the comparison of graphics between PS5 and PC is illuminating here.

 

He seemed to find the PS5 putting out GPU power roughly equivalent to a 1070 Ti on the low end or a 1080 on the high end (which would also be in the 2070 non-Super neighborhood), at least in terms of the graphics/textures it can output. This is important because these GPUs (including those in the Xbox and PS5) are not going to push textures their GPU can't handle. They don't seem to push textures that are super impressive (by current GPU standards), and as such, developers won't be making games that require more than 10GB of VRAM. Why not? Because the PS5 can't actually run those kinds of textures at a playable framerate. If they made a game requiring 15GB of VRAM at 1440p, the PS5 would likely be running it at 10 FPS.

 

The devs aren't going to focus on making games with graphics/textures so crazy high that the game is unplayable on consoles. Cyberpunk, which is hard as hell to run on anything, runs fine on 10GB... at 4K. Aside from games that are heavily modded, this is really not a concern at all in current-gen gaming. I still haven't encountered a game that my 2080 Ti couldn't run at 4K due to VRAM limitations.

 

 

El Zoido:  9900k + RTX 4090 / 32 gb 3600mHz RAM / z390 Aorus Master 

 

The Box:  3900x + RTX 3080 /  32 gb 3000mHz RAM / B550 MSI mortar 


On 9/20/2020 at 9:51 PM, Moonzy said:

Download more RAM pfft

😂 Careful now. The problem with beautiful lies is that people like to believe them so much that even a bad lie will work, because people would rather be fooled by it than face the cold winds of reality. Even the silliest, most badly done magic tricks can work if someone wants badly enough to believe they're true.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


7 minutes ago, Zberg said:

Steve from GN's video on the comparison of graphics between PS5 and PC is illuminating here.

 

He seemed to find the PS5 putting out GPU power roughly equivalent to a 1070 Ti on the low end or a 1080 on the high end (which would also be in the 2070 non-Super neighborhood), at least in terms of the graphics/textures it can output. This is important because these GPUs (including those in the Xbox and PS5) are not going to push textures their GPU can't handle. They don't seem to push textures that are super impressive (by current GPU standards), and as such, developers won't be making games that require more than 10GB of VRAM. Why not? Because the PS5 can't actually run those kinds of textures at a playable framerate. If they made a game requiring 15GB of VRAM at 1440p, the PS5 would likely be running it at 10 FPS.

 

The devs aren't going to focus on making games with graphics/textures so crazy high that the game is unplayable on consoles. Cyberpunk, which is hard as hell to run on anything, runs fine on 10GB... at 4K. Aside from games that are heavily modded, this is really not a concern at all in current-gen gaming. I still haven't encountered a game that my 2080 Ti couldn't run at 4K due to VRAM limitations.

 

 

The other part of this is, as I understand it (and I may not), that it's 16GB shared. The OS will use x amount, the game will use y amount, and the two together have to fit in 16GB. The XBO and PS5 both use OSes that take less than 8GB; they're both leaner than Win10, and Win10 can generally run itself and at least one other app on 8GB. The machine will never have the full 16GB available, because there has to be at least some OS running; likewise, it may well have more than 8GB available for a game. This may be why they have the whole game-switching thing: the total memory could be big enough to hold more than one app along with the OS. There may be a limit on how much memory a programmer is allowed to use, in order to make more than one app fit in memory.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

