I have a question about what causes Excel to slow down

My department uses Excel daily, and we normally work in the same file for at least a month. Over that time the file accumulates a few thousand rows (sometimes 10k, other times up to 30k or even 50k), plus multiple sheets with different pivots and computations. My question is: when Excel slows down as the data grows, what is the cause? Is it simply a limitation of the app itself, or would upgrading the CPU help? Thanks!


What kind of machines are you running this on?

 

Excel isn't perfect. Hit it with enough calculations and it does take real processing time to compute everything. 50k rows shouldn't slow things down dramatically on its own, but another question is: where do you store it? If it's on a network share and your tech group is security-heavy, that's probably it. If it's a "shared" file where multiple people can use it simultaneously, like an Access file, that will also hurt performance.


23 minutes ago, johnt said:

What kind of machines are you running this on?

 

Excel isn't perfect. Hit it with enough calculations and it does take real processing time to compute everything. 50k rows shouldn't slow things down dramatically on its own, but another question is: where do you store it? If it's on a network share and your tech group is security-heavy, that's probably it. If it's a "shared" file where multiple people can use it simultaneously, like an Access file, that will also hurt performance.

 

23 minutes ago, micha_vulpes said:

Some of this is on Excel itself. (I'm still on 2007, and at that time it was still primarily running on one core and not really multithreaded.)
Faster IPC will always help Excel run faster. If you have a large, iterative workbook that's full of a dozen sheets that all cross-reference, use macros, and run lots of iterative calculations, it's going to get slow, sometimes really slow, especially if you have tens of thousands of data points.

 

It can help to copy the working file to a RAM drive and open it from there, to rule out any drive- or network-access-related slowdowns.

It's not at crawling speed yet, but you can definitely notice the slowdown. The original file is actually on SharePoint, shared by the whole team, and it's tied to MS Forms as well (that's how new data gets entered). Since we're dealing with applicants' personal details, it's very security-heavy. And yes, I've pretty much accepted that when multiple users are editing rows/columns and entering new info at the same time, it tends to slow down. But I also do the reporting and download a copy to my PC at home with a 10600K. It's still a bit slower than I'd like when I filter, create new pivots, formulas, and macros. What really surprises me is that, at least in my experience, the conditional formatting rules on the original file are the main factor in the slowness: clearing the rules on the whole sheet improves responsiveness by a lot.

I've actually been thinking of upgrading, and one of the main factors is this: whether to stick with a Ryzen 5/i5 or go with an R7/i7. I also use macros and even Kutools (you've probably heard of it). Other reports, where I consolidate files and remove almost a hundred thousand duplicates, normally take a while, like 30 minutes or up to an hour. I guess the only good thing about that 30-minute-to-an-hour wait is that I get to dish out my Steam Deck and play. When I'm at the office, or if my superior calls while I'm working at home, I can honestly say that if I use the PC while it deletes the duplicates, it will take even longer lol
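For what it's worth, outside Excel that kind of de-duplication is nearly instant even on 100k rows. A rough Python sketch, with made-up rows since the actual applicant data obviously can't be shared:

```python
# Order-preserving de-duplication of row data, the kind of cleanup that
# can take Excel a long time on ~100k rows. Rows here are invented.

def dedupe_rows(rows):
    """Keep the first occurrence of each row, preserving order (O(n))."""
    seen = set()
    unique = []
    for row in rows:
        key = tuple(row)          # rows must be hashable to go in a set
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

# ~100k rows with heavy duplication, like a consolidated report.
rows = [("applicant", i % 5000, "dept") for i in range(100_000)]
unique = dedupe_rows(rows)
print(len(unique))  # 5000 unique rows survive
```

This runs in well under a second; the half-hour wait in Excel is overhead from recalculation, formatting, and the UI, not the raw work of finding duplicates.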


- More formulas means more to recalculate each time you make a change (some formulas are especially heavy; Excel will warn you if you use those).

- Formatting rules - this is an especially dangerous one, because copy-pasting can in some cases create duplicate rules, then quadruple rules, etc. You see where this goes.

- Assume that other "active elements" add their own load to the file.

- Overall file size has an impact too, but it should be relatively minor compared to the other factors.
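That rule-doubling compounds fast. Here's a toy model of the growth; the numbers are invented for illustration, not measured from Excel:

```python
# Toy model of conditional-formatting rule growth when copy-pasting
# duplicates the rules each time (the "duplicate, then quadruple" effect).

def rules_after_pastes(initial_rules, pastes):
    """Each rule-duplicating paste doubles the count: exponential growth."""
    return initial_rules * 2 ** pastes

for pastes in (0, 1, 2, 10):
    print(pastes, rules_after_pastes(3, pastes))
# 3 rules become 3072 after just 10 duplicating pastes
```

Which is why clearing the rules on a whole sheet, as mentioned above, can improve responsiveness so dramatically.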

47 minutes ago, micha_vulpes said:

but at that time it was still primarily running on one core and not really multithreaded.

Last time I used Excel 2019, all the math was still on a single core.


7 hours ago, manikyath said:

Last time I used Excel 2019, all the math was still on a single core.

Since 2016, Excel has been able to use multiple cores for some calculations, but I'm guessing it's a niche case; I've never seen it happen at a large enough scale to notice.

 

7 hours ago, kitnoman said:

The original file is actually on SharePoint

SharePoint handles MS Office files differently: it manages the network traffic for both old Office versions and the newer 365 versions. The updates still happen through a single pipe, which means that if two people add data, the writes are applied one after the other, not simultaneously. Also, formatting/conditions/macros/formulas won't always update on every modification; if a huge batch of changes arrives all at once, some updates may be delayed to avoid running them more than once and slowing things down more than necessary.

 

That being said, a huge amount of formatting means all of that is stored in the file itself, not client-side. One reason databases exist is that formatting doesn't live alongside the data: you want the data to be as fast and as available as possible. That lets the client take care of the visual formatting, so no one slows down anyone else in the chain.

 

One solution is to move to an actual database. You have SharePoint (I'm guessing at least version 2019), and you can use SQL Express databases and hook SharePoint forms to them the same way your current forms are hooked to the Excel file. But you need new forms to display the results, and you need to set the formatting logic/conditions in the form fields/datagrid.
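To make that concrete, here's a sketch using Python's built-in sqlite3 as a stand-in for SQL Express; the table and column names are invented for illustration:

```python
# Sketch of moving applicant rows out of the workbook and into a real
# database. sqlite3 stands in for SQL Express here; the schema is invented.
import sqlite3

conn = sqlite3.connect(":memory:")   # a file path would persist the data

conn.execute("""
    CREATE TABLE applicants (
        id    INTEGER PRIMARY KEY,
        name  TEXT NOT NULL,
        dept  TEXT NOT NULL
    )
""")

# Form submissions become INSERTs instead of rows appended to a worksheet.
conn.executemany(
    "INSERT INTO applicants (name, dept) VALUES (?, ?)",
    [("Alice", "Sourcing"), ("Bob", "Recruitment"), ("Carol", "Sourcing")],
)

# Reporting becomes a query; formatting stays client-side.
count = conn.execute(
    "SELECT COUNT(*) FROM applicants WHERE dept = ?", ("Sourcing",)
).fetchone()[0]
print(count)  # 2
```

The point is the separation: the data layer only stores and serves rows, so heavy formatting on one person's view never slows down anyone else's writes.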

 


1 hour ago, Franck said:

Since 2016, Excel has been able to use multiple cores for some calculations, but I'm guessing it's a niche case; I've never seen it happen at a large enough scale to notice.

 

Explain to me then why my huge-arse Excel sheet pegged a single core at 100% for 15 minutes while doing math? (And yes, 15 minutes. It was thousands upon thousands of nested formulas.)

 

"Microsoft says" doesn't necessarily match "Microsoft does".


18 minutes ago, manikyath said:

"Microsoft says" doesn't necessarily match "Microsoft does".

At best, each sheet might be able to use a separate core if it has no dependencies on other tabs. I don't see how Excel can determine whether a sheet can be calculated concurrently, especially with nested equations. There is a lot of linearity in its layered-results method: you use a calculation to determine a result, then use that result somewhere else. One has to be performed before the other, so splitting the calculations doesn't make sense.

 

Or you can open multiple instances of Excel now. I wonder if each file can utilize a different core during heavy calcs. Something to look into I guess.

 

Also, I can see Excel using multiple cores for background processes so the interface stays a little faster while you're using it.

 

Who knows!
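The "one has to be performed before the other" point can be sketched as a dependency graph: cells group into "levels" where each level depends only on earlier ones. Levels must run in sequence; only cells within one level could safely run in parallel. A minimal Python sketch, with invented cell names and dependencies:

```python
# Group formula cells into evaluation levels. Cells in the same level
# have no dependencies on each other; levels must be computed in order.

def dependency_levels(deps):
    """deps maps cell -> set of cells it reads. Returns ordered levels."""
    levels, placed = [], set()
    while len(placed) < len(deps):
        ready = {c for c, reads in deps.items()
                 if c not in placed and reads <= placed}
        if not ready:
            raise ValueError("circular reference")
        levels.append(sorted(ready))
        placed |= ready
    return levels

deps = {
    "A1": set(), "B1": set(),   # raw inputs
    "C1": {"A1", "B1"},         # e.g. =A1+B1
    "D1": {"C1"},               # e.g. =C1*2, must wait for C1
}
print(dependency_levels(deps))  # [['A1', 'B1'], ['C1'], ['D1']]
```

With a long chain of nested formulas every level holds only a handful of cells, which is exactly the shape of workbook that pegs a single core no matter how many are available.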


2 hours ago, manikyath said:

Explain to me then why my huge-arse Excel sheet pegged a single core at 100% for 15 minutes while doing math? (And yes, 15 minutes. It was thousands upon thousands of nested formulas.)

 

"Microsoft says" doesn't necessarily match "Microsoft does".

Like I said, I've never seen it happen where it was noticeable.

 

Anyhow, thousands and thousands of formulas are fast to compute (a couple of seconds), just not for Excel. In Excel you have to rely on tricks to make it faster: delaying specific updates, forcing the update order, and more.


3 hours ago, johnt said:

 

Or you can open multiple instances of Excel now. I wonder if each file can utilize a different core during heavy calcs. Something to look into I guess.

Office doesn't even do that for rendering your document (even in Word, etc.), so I doubt this exists.

 

Fun side note: if one document crashes, it has a tendency to take everything else down with it, because it's all on one thread.


2 hours ago, manikyath said:

fun

That doesn't sound fun at all lol

 

tbh I can't remember a software crash on my work devices that wasn't caused by the intense security measures to protect our data and privacy, while Microsoft mines it and sells it anyway


12 hours ago, Franck said:

That being said, a huge amount of formatting means all of that is stored in the file itself, not client-side. One reason databases exist is that formatting doesn't live alongside the data: you want the data to be as fast and as available as possible. That lets the client take care of the visual formatting, so no one slows down anyone else in the chain.

 

One solution is to move to an actual database. You have SharePoint (I'm guessing at least version 2019), and you can use SQL Express databases and hook SharePoint forms to them the same way your current forms are hooked to the Excel file. But you need new forms to display the results, and you need to set the formatting logic/conditions in the form fields/datagrid.

 

How I wish I could just use Access or a database. But the decision to make that happen is above my pay grade. We're not a tech company; I'm in our recruitment and sourcing department. Everyone is good at what they do, but it would probably take some training and time to adjust, which means cost and a drop in productivity.

 

 


On 7/29/2023 at 4:24 AM, kitnoman said:

How I wish I could just use Access or a database. But the decision to make that happen is above my pay grade. We're not a tech company; I'm in our recruitment and sourcing department. Everyone is good at what they do, but it would probably take some training and time to adjust, which means cost and a drop in productivity.

At some point it must be decided anyway: keep working on something that is slow, prone to data loss and crashes, and not made for what it's apparently being used for, or move to something off the shelf or custom-made for the purpose, accept some downtime during training, and end up more effective, secure, and safe. "We're not a tech company" is such a weird argument. You don't need to be tech-anything to use tools that are made to increase productivity. You could present the issues, the solutions, and the cost/time savings to your superior and see what they think.

 

Some things here just give me chills. I've worked at two companies that rely heavily on Excel as a shared platform. One used a network drive; the other (current) uses SharePoint. Both have had several incidents of data loss, all from multiple users working in the same file directly. With the network drive, the issue was that some users left the file open after their shift ended, meaning that at worst no one could make changes or even open the file. Sometimes closing the file the wrong way corrupted the data and the backup had to be restored instead. This happened repeatedly during a 9-month period: corruption 1-4 times, multiple-user issues weekly. The current SharePoint method is a bit better, as the file can usually handle multiple users. I say usually because we've still had corruption a couple of times. The conclusion: opening the file in the desktop app can make it so others can't see changes or open the file.

 

With that, I would say one of the bigger reasons for any slowness is multiple simultaneous users. But the sheer amount of data matters too: 10k rows is a lot. The same goes for fewer rows with calculations happening all over the file.


On 7/28/2023 at 9:24 PM, kitnoman said:

How I wish I could just use Access or a database. But the decision to make that happen is above my pay grade. We're not a tech company; I'm in our recruitment and sourcing department. Everyone is good at what they do, but it would probably take some training and time to adjust, which means cost and a drop in productivity.

 

 

Most people forget that small, recurring slowdowns add up to a lot over time. Say you only run that hour-long report twice a week, and there are two dozen 15-second slowdowns a day. Let's say you're a team of 5 who all experience the slowdowns, but only you run the reports.

 

Over a year that is:

Waiting for reports: 52 weeks x (2 reports x 1 hour each) = 104 hr
Waiting for the file: 250 work days x 24 slowdowns x 5 people x 15 sec wait = 125 hr

Total: 229 hours wasted per year

 

Estimated professional rework: 8 weeks with safety margin (320 hrs)
Payback period: ~20 months
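As a quick sanity check, the arithmetic works out (same assumed numbers as above):

```python
# Re-deriving the yearly time-waste figures from the estimate above.
SECONDS_PER_HOUR = 3600

report_hours = 52 * (2 * 1)         # 52 weeks, 2 one-hour reports per week
file_wait_hours = 250 * 24 * 5 * 15 / SECONDS_PER_HOUR
                                     # work days * slowdowns * people * seconds

total = report_hours + file_wait_hours
print(report_hours, file_wait_hours, total)  # 104 125.0 229.0
```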

