
Is there a program that does this?

Cvdasfg

I currently have three backup drives full of random stuff. I bought a bigger one and would like to put the three drives into one. However, I know there are a lot of copies of the same files across the three different drives. There are FAR too many folders/files to check manually. Is there a program that could find the unique folders/files on each one and build one combined copy from the three smaller HDDs, without duplicates?



Yes, this is a common problem. Try Auslogics Duplicate File Finder. I believe it is free these days.


CCleaner has a duplicate finder :)

Both of these would work if I could fit the contents of the three drives onto the one drive with the duplicates still there, but I can't.



Auslogics will scan multiple drives at once.  Simply connect them all simultaneously, clean up the dups, and then you can consolidate the data onto one drive
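If you're curious what a scan like that is doing under the hood, here's a rough Python sketch of the idea: hash every file on all three drives in place and group the ones with identical content. The drive letters are just placeholders I made up for wherever your drives mount, and this is my own illustration, not how Auslogics actually implements it.

import hashlib
from collections import defaultdict
from pathlib import Path

ROOTS = [Path("E:/"), Path("F:/"), Path("G:/")]  # placeholder mount points for the three backup drives

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    # Hash in 1 MiB chunks so huge files never have to fit in RAM.
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(roots):
    by_hash = defaultdict(list)
    for root in roots:
        for path in root.rglob("*"):
            if path.is_file():
                try:
                    by_hash[sha256_of(path)].append(path)
                except OSError:
                    pass  # skip unreadable files
    # Keep only hashes that more than one file shares.
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

if __name__ == "__main__":
    for digest, paths in find_duplicates(ROOTS).items():
        print(f"Identical content ({digest[:12]}...):")
        for p in paths:
            print(f"  {p}")

Nothing gets copied or deleted here; it just reports which files are the same so you can decide what to keep before consolidating.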


Both of these would work if I could fit the contents of the three drives onto the one drive with the duplicates still there, but I can't.

Of course you can.

Just create 3 folders.

Put the content of each drive into a separate folder on the same drive, then run the duplicate finder on that one drive that contains everything.

Once that's done, you can put everything together.

Not to mention, CCleaner can scan multiple drives too (Tools > Duplicate Finder).
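If you'd rather script that copy step than drag and drop, a rough Python sketch of the same idea is below. The drive letters are just placeholders I made up; adjust them to your own setup.

import shutil
from pathlib import Path

NEW_DRIVE = Path("H:/")                                # placeholder: the new, bigger drive
OLD_DRIVES = [Path("E:/"), Path("F:/"), Path("G:/")]   # placeholders: the three backup drives

for i, src in enumerate(OLD_DRIVES, start=1):
    # Give each old drive its own subfolder so nothing collides before the duplicate scan.
    dest = NEW_DRIVE / f"drive{i}"
    shutil.copytree(src, dest, dirs_exist_ok=True)

Note this still needs enough free space on the new drive for everything, duplicates included.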



They don't fit... it's 4TB too big lol.



I think he was saying that he can't do exactly that because there would be no room


Auslogics will scan multiple drives at once.  Simply connect them all simultaneously, clean up the dups, and then you can consolidate the data onto one drive

Makes sense, I'll give it a try :) thanks!


AllDup: tell it where to look and how to compare, and it will find all the duplicates.


It occurred to me that it might be worth discussing some of the options these programs typically offer, since by the sound of things this is your first time doing something like this (OP).

When searching for duplicates, the first distinction you have to make (and thus have to set in the software) is what exactly you are comparing. Are you just comparing a bunch of files and removing multiple copies of files with the same name, or are you looking for genuine duplicates? (I assume the latter.) By genuine duplicates, I mean files that have identical content but may or may not have the same file name. For this reason it is important to take note of the options in the software that affect this.

Different packages may also give you different scan options, from a byte-by-byte comparison, to hash comparisons, to something as simple as checking whether two files have the exact same size and modification date. Picking the right one depends on how much you value the accuracy of the results versus your time, since the choice is a trade-off between missing files or deleting non-duplicates on one side and taking an absolute eternity on the other. I believe most programs offer, and default to, some sort of hash comparison, which, though not absolutely infallible, is extremely trustworthy and still completes in a reasonable amount of time.
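To make that name-vs-content distinction concrete, here's a rough Python sketch of the two matching criteria. The helpers and paths are made up for illustration; they aren't anything a specific program exposes.

import hashlib
from collections import defaultdict

def group_by_name(files):
    # "Duplicate" = same file name. Would wrongly pair two different photos both called holiday.jpg.
    groups = defaultdict(list)
    for f in files:
        groups[f.name.lower()].append(f)
    return {k: v for k, v in groups.items() if len(v) > 1}

def group_by_content(files):
    # "Duplicate" = identical bytes, whatever the name. This is what you want for genuine duplicates.
    groups = defaultdict(list)
    for f in files:
        digest = hashlib.sha256(f.read_bytes()).hexdigest()  # fine for a sketch; real tools stream large files
        groups[digest].append(f)
    return {k: v for k, v in groups.items() if len(v) > 1}

Both functions take a list of pathlib.Path objects; only the second finds renamed copies and ignores coincidental name clashes.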



I have plenty of time to wait :) What is a program that can compare the actual content / is the most accurate? I need to find genuine duplicates.



A byte-by-byte comparison would be infallible, but that might take weeks... honestly, I don't even know.

Generally, I believe the default algorithm is a hash combined with other info like file size, and that is generally accepted to be good enough. And I don't mean good enough as in less than 2% error or something; I mean that even with that significantly faster algorithm it will virtually never make a mistake (as far as I know).
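If it helps, here's a rough sketch of the kind of tiered check I mean: group by size first (nearly free), hash only the files that share a size, and optionally confirm byte by byte at the end. It's an illustration of the trade-off, not any particular program's actual algorithm.

import filecmp
import hashlib
from collections import defaultdict

def sha256_of(path, chunk=1 << 20):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def tiered_duplicates(files, confirm=False):
    by_size = defaultdict(list)
    for f in files:
        by_size[f.stat().st_size].append(f)  # stage 1: file size, almost free

    groups = []
    for same_size in by_size.values():
        if len(same_size) < 2:
            continue  # a unique size can't have a duplicate
        by_hash = defaultdict(list)
        for f in same_size:
            by_hash[sha256_of(f)].append(f)  # stage 2: content hash, the usual default
        for candidates in by_hash.values():
            if len(candidates) < 2:
                continue
            if confirm:
                # Stage 3: byte-by-byte against the first file in the group.
                # Slow, but removes even the theoretical hash-collision risk.
                candidates = [candidates[0]] + [c for c in candidates[1:]
                                                if filecmp.cmp(candidates[0], c, shallow=False)]
            if len(candidates) > 1:
                groups.append(candidates)
    return groups

With confirm=False it behaves like the common default (size plus hash); with confirm=True it pays the byte-by-byte cost only on files that already look identical, which is far cheaper than comparing everything against everything.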

