How to download all pics from website at once?

Hi guys.

How could I download all the pics from the website videocardz.net at once? I mean the pics of each of the cards on the website, in full resolution (sample pic below).

 

6c20359cebb93fe19572b90b0d19a3eb-1200x900.jpg

 

Please do not take offence at my apparent confusion or rudeness; it's not my intent to be like that, it's just my BPD. Be nice to me and I'll return it twice over; be rude and I usually get easily pissed off... I'll try to help anyone here, as long as it's something I've dealt with, and even if you think I'm rude or impolite, forgive me; it's not me, it's my BPD.

Thanks for understanding.


1 minute ago, frozensun said:

Hi guys.

How could I download all the pics from the website videocardz.net at once? I mean the pics of each of the cards on the website, in full resolution (sample pic below).

 

6c20359cebb93fe19572b90b0d19a3eb-1200x900.jpg

Ctrl + A (but that selects everything, including the text; you could just delete the text afterwards)

Message me on discord (bread8669) for more help 

 

Current parts list

CPU: R5 5600 CPU Cooler: Stock

Mobo: Asrock B550M-ITX/ac

RAM: Vengeance LPX 2x8GB 3200MHz CL16

SSD: P5 Plus 500GB Secondary SSD: Kingston A400 960GB

GPU: MSI RTX 3060 Gaming X

Fans: 1x Noctua NF-P12 Redux, 1x Arctic P12, 1x Corsair LL120

PSU: NZXT SP-650M SFX-L PSU from H1

Monitor: Samsung WQHD 34 inch and 43 inch TV

Mouse: Logitech G203

Keyboard: Rii membrane keyboard

Damn this space can fit a 4090 (just kidding)


3 minutes ago, filpo said:

Ctrl + A (but that selects everything, including the text; you could just delete the text afterwards)

I don't think you understood me... Ctrl+A just selects the text and pics on the website.

I want to download all pictures of the cards that are posted on the website.

 



1 minute ago, frozensun said:

I don't think you understood me... Ctrl+A just selects the text and pics on the website.

I want to download all pictures of the cards that are posted on the website.

Use a browser extension like Imageye, Gallerify, ImageDrain or DownThemAll. They should all be on the Microsoft Edge Add-ons store and the Chrome Web Store.

Message me on discord (bread8669) for more help


1 minute ago, filpo said:

Use a browser extension like Imageye, Gallerify, ImageDrain or DownThemAll. They should all be on the Microsoft Edge Add-ons store and the Chrome Web Store.

Tried them, no success. They download only what is currently on the page, and only as thumbnails.

 



You'd have to build some software that scrapes the entire website for full-size images.

 

Note that doing this "at full speed" is quite frowned upon, because you'd be putting an exceptionally high load on their servers.
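As a starting point for such a scraper: judging by the sample filename quoted above (`6c20359cebb93fe19572b90b0d19a3eb-1200x900.jpg`), the CDN appears to use WordPress-style `-WIDTHxHEIGHT` size suffixes, so the full-resolution original is often the same URL with that suffix removed. That is an assumption about the site's naming scheme, not a guarantee; verify it against one real image URL in the browser before relying on it.

```python
import re

def full_res_url(thumb_url: str) -> str:
    # Strip a WordPress-style "-WIDTHxHEIGHT" size suffix, e.g.
    # ".../6c20...-1200x900.jpg" -> ".../6c20....jpg".
    # Assumption: cdn.videocardz.net follows this common convention.
    return re.sub(r"-\d+x\d+(\.\w+)$", r"\1", thumb_url)

# e.g. full_res_url("https://cdn.videocardz.net/x-1200x900.jpg")
#      -> "https://cdn.videocardz.net/x.jpg"
```

A scraper would collect the thumbnail URLs from each page, map them through this helper, and then download the results (slowly, per the warning above).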


1 hour ago, frozensun said:

Hi guys.

How could I download all the pics from the website videocardz.net at once? I mean the pics of each of the cards on the website, in full resolution (sample pic below).

 

6c20359cebb93fe19572b90b0d19a3eb-1200x900.jpg

Depends on what you want them for. Copyright issues aside, why not contact the staff and have them sent to you? I'd bet you could get them for a small donation. Yes, there are programs or scripts that will do that, but as I mentioned, there are legal aspects to consider here. If you are looking for stock images of certain cards, you can get them from the manufacturers, either on their website or by contacting the press/PR department.


You could use website downloader software... personally I like Offline Explorer Pro / Enterprise.

 

You need to create a project in the software and set some basic rules about what to scrape: how many levels down it should go, whether it should fetch files from outside domains/websites (and if so, which ones), which file extensions it should get, and so on. It's very customizable.

 

In the case of this website, images seem to be served by cdn.videocardz.net, so you'd just have to configure the project to retrieve content only from videocardz.net and cdn.videocardz.net, and not go to other websites when following links on pages.

 

// You can also configure it to download only one picture at a time, and to wait some amount of time between transfers (e.g. 2-3 seconds between each request). This helps with websites that throttle or limit you if you do more than N requests per second/minute... you can just leave it running overnight to grab the content; not everything has to be super fast.
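That throttling idea isn't tied to any particular product; the same one-request-at-a-time-with-a-pause setup is a few lines in Python. `fetch` here is a stand-in for whatever actually performs the download (urllib, requests, etc.); only the pacing logic is shown.

```python
import time

def rate_limited_fetch(urls, fetch, delay=2.5):
    # Call fetch(url) for each URL in order, sleeping `delay` seconds
    # between requests so the server sees at most one request every
    # few seconds -- the one-connection-plus-pause setup described above.
    results = []
    for i, url in enumerate(urls):
        if i:  # no pause needed before the very first request
            time.sleep(delay)
        results.append(fetch(url))
    return results
```

With `delay=2.5` a few thousand images take a couple of hours, which is fine for an overnight run.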


12 minutes ago, mariushm said:

You could use a website downloader software ... personally I like Offline Explorer Pro / Enterprise [snip]

OK, thanks. Have you tried it yourself? Did it succeed?

 

 



The first time, this program did a great job for me, but now when I start the project in the same way, it simply stops with the following message (pic).

7 hours ago, mariushm said:

You could use a website downloader software ... personally I like Offline Explorer Pro / Enterprise [snip]

If only I knew how to use this program; the GUI is so complicated for what I need.

Screen Shot 04-08-23 at 07.00 PM.JPG

 



On 4/8/2023 at 11:33 AM, mariushm said:

You could use a website downloader software ... personally I like Offline Explorer Pro / Enterprise [snip]

Any tutorial on how to use this program for this particular task?

 


