
Google is Discontinuing Unlimited Storage for All Education Customers

wall03
1 hour ago, comander said:

1. Storage needs/use grow almost exponentially. A few years from now people will be using 2x the capacity.

So they DO need it, or at least they will.

1 hour ago, comander said:

2. That 1% might just be recklessly using the "free" storage. Imagine a film school storing 1TB per student across 10,000 students.

Good luck uploading that in a reasonable time... even at gigabit speeds it would take over 2.5 years of uninterrupted upload. Surely they would just share redundant material among all students, for their own convenience if for no other reason.
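The 2.5-year figure checks out. A quick back-of-envelope in Python, assuming the 10,000 students × 1 TB scenario quoted above and a single fully saturated 1 Gbps link (real-world protocol overhead would only make it longer):

```python
# Back-of-envelope check of the upload time claimed above.
# Assumes 10 PB total (10,000 students x 1 TB each) pushed through
# a single saturated 1 Gbps link with no overhead.

def upload_days(total_bytes: float, link_bps: float) -> float:
    """Days of uninterrupted transfer at the given link speed."""
    seconds = total_bytes * 8 / link_bps
    return seconds / 86_400  # seconds per day

total = 10_000 * 1e12           # 10,000 students x 1 TB each = 10 PB
days = upload_days(total, 1e9)  # 1 Gbps
print(f"{days:.0f} days, i.e. {days / 365:.1f} years")  # → 926 days, i.e. 2.5 years
```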

 

Regardless, even if they did, it would be a microscopic drop in the bucket for Google. In a conservative guesstimate from 8 years ago, xkcd placed the capacity of Google's datacenters at around 15 EB. I think it's safe to assume that has increased by at least two orders of magnitude since then, considering that in 2020 alone Google spent about as much money on datacenters as it had between 1998 and 2013, and storage price per GB has fallen by a factor of 30 since then. If 1% of all universities on the planet used 10 PB of storage each, Google could store it 10 times over and have plenty to spare.
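For what it's worth, the same Fermi estimate in code. The 15 EB baseline and the 100x growth factor come from the paragraph above; the worldwide university count (~25,000) is my own rough assumption:

```python
# Fermi-style sanity check of the claim above. All inputs are rough
# assumptions: the old xkcd estimate (~15 EB), two orders of magnitude
# of growth since, and ~25,000 universities worldwide (my guess).

EB = 1e18  # bytes
PB = 1e15

google_capacity = 15 * EB * 100   # assumed 100x growth since the estimate
demand = 0.01 * 25_000 * 10 * PB  # 1% of universities storing 10 PB each
print(f"Google could hold that demand {google_capacity / demand:.0f} times over")
```

Even if the university count is off by a factor of ten in either direction, the headroom comfortably survives the "10 times over" claim.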

1 hour ago, comander said:

3. Reducing associated bandwidth costs... 

Reducing capacity doesn't automatically reduce bandwidth usage; in fact it may increase it, due to the need to swap data in and out more frequently.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


3 hours ago, GDRRiley said:

20,000 people sharing 100 TB, hmm, that's 5 GB per user, which is more than my school gives for our on-site storage at 100 MB per student, which is useless.

 

heh, 20k people, if only it were that "few" 🙃

 

I forget the default home drive quotas for staff and students, but for standard network shares the first 50 GB is free.
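As an aside, the pooled-quota arithmetic from the quote above actually works out to 5 GB per user, not 0.2 GB (100 TB pooled across 20,000 users):

```python
# Quick check of the pooled-quota arithmetic discussed above.
TB, GB, MB = 1e12, 1e9, 1e6

pool_per_user = 100 * TB / 20_000  # Google's pooled 100 TB across 20k users
onsite = 100 * MB                  # the on-site per-student quota mentioned above
print(f"{pool_per_user / GB:.0f} GB per user vs {onsite / MB:.0f} MB on-site")
```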


2 hours ago, comander said:

That 1% might just be recklessly using the "free" storage. Imagine a film school storing 1TB per student across 10,000 students.

We have media design courses; total usage at the end of this year was 180 TB, and that was for only around 100 students (maybe a little more, but either way not ~10k). At the end of the year that data is archived to tape and deleted.

 

This storage array is going to be more than doubled for this year, with an ongoing expansion plan to around 600 TB next year. Right now the lack of storage impacts the courses and hampers what they can do. This storage system is just for the media design department; they don't use our central ITS storage.

 

1TB per student is much lower than what is actually required to run such courses.


13 hours ago, leadeater said:

LOL! We have individual network shares larger than 100TB, totally useless.

My previous previous place of work had 1 petabyte just for a finance team of only 10 people...

 

I asked my boss why?! He said don't ask; he doesn't even know why either. 😂🤣

CPU: AMD Ryzen 5 5600X | CPU Cooler: Stock AMD Cooler | Motherboard: Asus ROG STRIX B550-F GAMING (WI-FI) | RAM: Corsair Vengeance LPX 16 GB (2 x 8 GB) DDR4-3000 CL16 | GPU: Nvidia GTX 1060 6GB Zotac Mini | Case: K280 Case | PSU: Cooler Master B600 Power supply | SSD: 1TB  | HDDs: 1x 250GB & 1x 1TB WD Blue | Monitors: 24" Acer S240HLBID + 24" Samsung  | OS: Win 10 Pro

 

Audio: Behringer Q802USB Xenyx 8 Input Mixer |  U-PHORIA UMC204HD | Behringer XM8500 Dynamic Cardioid Vocal Microphone | Sound Blaster Audigy Fx PCI-E card.

 

Home Lab:  Lenovo ThinkCenter M82 ESXi 6.7 | Lenovo M93 Tiny Exchange 2019 | TP-LINK TL-SG1024D 24-Port Gigabit | Cisco ASA 5506 firewall  | Cisco Catalyst 3750 Gigabit Switch | Cisco 2960C-LL | HP MicroServer G8 NAS | Custom built SCCM Server.

 

 


2 minutes ago, Sir Asvald said:

My previous previous place of work had 1 petabyte just for a finance team of only 10 people...

If you had said BI team I'd understand, but finance? Weird. Guess they were doing larger-scale data analytics as well then 🤷‍♂️


5 hours ago, leadeater said:

If you had said BI team I'd understand, but finance? Weird. Guess they were doing larger-scale data analytics as well then 🤷‍♂️

Come to think of it, I do remember having a discussion about Power BI in one of our meetings. But that was for the sales team; they might have joined in with finance. That would make sense. I left just before they started implementing it.



Yeah ok then..

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


4 hours ago, LogicalDrm said:

@wall03, your opener here is lacking the personal input required for Tech News posts. Please fix.

If not corrected, this thread will be moved to another, suitable subforum, probably to Software.

fixed

please quote me or tag me @wall03 so i can see your response

motherboard buying guide      psu buying guide      pc building guide     privacy guide

ltt meme thread

folding at home stats

 

pc:

 

RAM: 16GB DDR4-3200 CL-16

CPU: AMD Ryzen 5 3600 @ 3.6GHz

SSD: 256GB SP

GPU: Radeon RX 570 8GB OC

OS: Windows 10

Status: Main PC

Cinebench R23 score: 9097 (multi) 1236 (single)

 

don't some things look better when they are lowercase?

-wall03

 

hello dark mode users

goodbye light mode users


11 minutes ago, comander said:

The smallest university I attended had something like 35k. 

That's the size of the one I work at with about 6500 staff as well.

 

Honestly though, it's a bit more complicated than just looking at data storage amounts and network bandwidth. Cloud providers QoS inbound and outbound data at both the network layer and the storage layer, and unless you are paying extra for premium tier storage none of them will let you upload data at 100Gbps. On top of that, we back up data held in cloud providers back to on-prem storage, and egress data is even more heavily restricted. Throwing all your data into a cloud provider and then going "yay, all done and it's protected" isn't good enough, even with geo-replicated options, as any problem will just replicate and the data is gone from both. Any included snapshot option or "legal hold" that you can wrangle into a backup-ish purpose is not a backup either.

 

Getting data into and out of the cloud is highly problematic because none of them care about "you"; they care about "everyone", and that means "you" are not important.


11 minutes ago, comander said:

It does if it results in the media department setting up network shares for projects instead of just relying on cloud storage. 

If they really need 10PB of permanently stored data per year surely you can see how that could be a problem.

13 minutes ago, comander said:

Case in point ^

@leadeater is talking about on-site storage as far as I can tell. And as they said, all that stuff just gets backed up to tape and archived rather than left to rot indefinitely on Google cloud storage.

20 minutes ago, comander said:

Assume a university with 100Gbps upload aggregate (not rare). https://research.computing.yale.edu/services/high-speed-science-network

That's INTERNAL bandwidth. At best I'd expect the external connection to the internet to be 10Gbps, though maybe @leadeater can fact check me on that.

Spoiler: [image: Yale Science Network diagram]

10Gbps is obviously much faster than 1Gbps, but not enough to make uploading 10PB through it every year very practical. Plus you need to account for imperfect wireless coverage, students not being on campus 24/7, etc. Not to mention that a university where 10k students take a course requiring that much data every year is probably not that common.

29 minutes ago, comander said:

If each student is on campus 800 days over the span of 4 years

I was considering a span of a single year, since as we've established it's not uncommon for a media student to use around that much per year. That means your estimate turns into about 50% of network time used just to upload things to Google cloud services. And that's assuming you live on campus and can just leave your computer uploading overnight over the campus network, which is not a given. While it's theoretically possible, it's not something that people just accidentally do because they have unlimited cloud storage; if at all possible they'll avoid redundant data and mostly upload things they need to share outside campus.
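One way to reproduce that ~50% figure, under assumed inputs (10 PB per year, a 10 Gbps external link, and 200 on-campus days per student per year, i.e. 800 over 4 years):

```python
# Rough reconstruction of the "about 50% network time" figure above.
# Assumptions: 10 PB per year (10k students x 1 TB), a 10 Gbps external
# link, and 200 on-campus days per student per year (800 over 4 years).

transfer_days = 10_000 * 1e12 * 8 / 10e9 / 86_400  # days of saturated uplink
on_campus_days = 800 / 4                           # days on campus per year
print(f"{transfer_days:.0f} days of saturated uplink, "
      f"~{transfer_days / on_campus_days:.0%} of the on-campus time")
```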

34 minutes ago, comander said:

For what it's worth, I'm just doing fermi-style calculations. All I care about is being within a factor of 10 or so... I might be off on the quantity of uploads and the number of students but ideally the errors cancel. I could easily see NYU having 1000 film students. 

Originally you mentioned 10k students uploading 1TB each. If you change that to 1k, the matter changes entirely...



Oh crap. My edu account has over 120TB in it.


Quiet Whirl | CPU: AMD Ryzen 7 3700X Cooler: Noctua NH-D15 Mobo: MSI B450 TOMAHAWK MAX RAM: HyperX Fury RGB 32GB (2x16GB) DDR4 3200 Mhz Graphics card: MSI GeForce RTX 2070 SUPER GAMING X TRIO PSU: Corsair RMx Series RM550x Case: Be quiet! Pure Base 600

 

Buffed HPHP ProBook 430 G4 | CPU: Intel Core i3-7100U RAM: 4GB DDR4 2133Mhz GPU: Intel HD 620 SSD: Some 128GB M.2 SATA

 

Retired:

Melting plastic | Lenovo IdeaPad Z580 | CPU: Intel Core i7-3630QM RAM: 8GB DDR3 GPU: nVidia GeForce GTX 640M HDD: Western Digital 1TB

The Roaring Beast | CPU: Intel Core i5 4690 (BCLK @ 104MHz = 4,05GHz) Cooler: Akasa X3 Motherboard: Gigabyte GA-Z97-D3H RAM: Kingston 16GB DDR3 (2x8GB) Graphics card: Gigabyte GTX 970 4GB (Core: +130MHz, Mem: +230MHz) SSHD: Seagate 1TB SSD: Samsung 850 Evo 500GB HHD: WD Red 4TB PSU: Fractal Design Essence 500W Case: Zalman Z11 Plus

 


27 minutes ago, wall03 said:

fixed

You added a couple more words and call it fixed? That is within neither the spirit nor the letter of the rules. Moved to Programs, Apps and Websites.

^^^^ That's my post ^^^^
<-- This is me --- That's your scrollbar -->
vvvv Who's there? vvvv


24 minutes ago, Sauron said:

@leadeater is talking about on-site storage as far as I can tell. And as they said, all that stuff just gets backed up to tape and archived rather than left to rot indefinitely on Google cloud storage.

Yep, you aren't going to video edit from cloud storage, although you can if a local sync copy is kept for current data, like OneDrive can do; I assume GDrive has something similar. But if you copy this student data into a cloud provider for data protection then you'll be using a lot, and if you don't remove the cloud copies then Google has to carry a hell of a lot of data even if you have good data management policies for local data.

 

Like, I'm not at all surprised Google has done this; unlimited storage is essentially committing to filling the entire universe with storage devices. You really do need defined limits so everyone has a clear understanding and expectations.

 

24 minutes ago, Sauron said:

That's INTERNAL bandwidth. At best I'd expect the external connection to the internet to be 10Gbps, though maybe @leadeater can fact check me on that.

Depends. We have 6x 100Gbps for all data that stays inside the REANNZ network and the wider international academic and research networks. If the cloud provider has an access node within that network then you'll have as much bandwidth as they allow and you have available. Traffic going outside the REANNZ network is as fast as we are willing to pay for, up to the technical capabilities of REANNZ (more than we need); I think we are paying for 2Gbps, though it may be more, as it was increased recently either to that 2Gbps or something higher.

 

Google, AWS and Azure have access nodes (network peering) in all the major internet exchanges in NZ, and so do we, so all traffic to and from these providers and anyone else will be at 100Gbps (well, whatever they are peering at, blah blah etc).

 

It's pretty well moot though, as none of them will allow you to ingress or egress those kinds of data rates on standard offerings; nothing in O365 A1 through A5 (we have A5). You'd have to pay for ExpressRoute and a premium service offering within Azure that also allows it. Since there is no Azure DC in NZ yet, ExpressRoute cannot actually provide an improved service; just an NZ problem though.

 

All day every day, 24/7/365, we are hitting Azure and O365 QoS limits for each service we use, so it's actually impossible to utilize more of it. While a lot of the QoS rules are per user, we still need to get the data back out for backups, and we already have a rotation of 20 accounts to do this, which is the maximum allowed in the backup software we use without configuring another new O365 backup client with its own servers etc.
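A hypothetical sketch of what that account rotation might look like: cycling backup jobs across a fixed pool of service accounts so no single account sits on the per-user QoS limits. The account names and job labels are purely illustrative, not the actual backup software's API:

```python
# Illustrative only: round-robin assignment of backup jobs across a
# pool of service accounts (names are made up, not a real system).
from itertools import cycle

accounts = [f"backup-svc-{i:02d}@example.edu" for i in range(20)]
next_account = cycle(accounts)  # endless round-robin over the pool

def assign(jobs):
    """Pair each backup job with the next account in the rotation."""
    return [(job, next(next_account)) for job in jobs]

print(assign(["mailboxes", "onedrive", "sharepoint"]))
```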

 

Did I mention working with cloud sucks?

 

24 minutes ago, Sauron said:

Not to mention a university where 10k students every year take a course that requires that much data is probably not that common an occurrence.

Yep, certainly not; such is why the media design department has its own storage. They disproportionately use more per student than anyone else and need to ensure high performance, and the best way to do that is a purpose-built system just for them. They are large enough that it makes sense to do so; economies of scale.

