
VGA vs HDMI

GeorgeKellow

The only reason to stick with VGA nowadays is if you have a CRT monitor or an old projector. Any half-decent TFT or LCD monitor since like 2004 should be run over DVI and/or HDMI if the video card has either output.

 

VGA (along with Composite & Component Video, particularly the RCA-connector types) is prone to signal degradation over distance and interference from nearby EM-generating devices, unless using a shielded cable (hint: most VGA & RCA cables are NOT shielded to save on costs).

DVI, HDMI & DisplayPort being purely-digital suffer no degradation or interference from outside sources - you get either all of the data or none of it if something is broken in the chain between the video output and the screen.


2 minutes ago, Technous285 said:

The only reason to stick with VGA nowadays is if you have a CRT monitor or an old projector. Any half-decent TFT or LCD monitor since like 2004 should be run over DVI and/or HDMI if the video card has either output.

 

VGA (along with Composite & Component Video, particularly the RCA-connector types) is prone to signal degradation over distance and interference from nearby EM-generating devices, unless using a shielded cable (hint: most VGA & RCA cables are NOT shielded to save on costs).

DVI, HDMI & DisplayPort being purely-digital suffer no degradation or interference from outside sources - you get either all of the data or none of it if something is broken in the chain between the video output and the screen.

90 percent of post-USSR countries use VGA to connect a computer to a monitor )

Main rig aka "Spatra"

CPU: AMD Athlon X2 | GPU: Palit GT 640 | HDD: some Toshiba hard drive that might die soon | RAM: 4 GB | Case: some Delux case with custom paint

 

Phones: main Philips S396, backup Samsung SGH-D780


11 minutes ago, Technous285 said:

So? The OP is from ENGLAND, which means he's got a better selection of technologies to pick from, and one of those happens to be HDMI - He has no reason to stick with the ancient VGA standard in this case.

Well, isn't it better to use trusted VGA? )


10 minutes ago, linushellskitchentips said:

Well, isn't it better to use trusted VGA? )

No it's not.

 

That's like saying: "Nah, I don't want to use a computer, I'd be better off using a typewriter and an abacus because they are 'trusted'".

Athlon X2 for only $27.31 | Best part lists at different price points | Windows 1.01 running natively on an Eee PC

My rig:

Spoiler

Celeronator (new main rig)

CPU: Intel Celeron (duh) N2840 2.16GHz Dual Core

RAM: 4GB DDR3 1333MHz

HDD: Seagate 500GB

GPU: Intel HD Graphics 3000 Series

Spoiler

Frankenhertz (ex main rig)

CPU: Intel Atom N2600 1.6GHz Dual Core

RAM: 1GB DDR3-800

HDD: HGST 320GB

GPU: Intel Graphics Media Accelerator (GMA) 3600

 


He's trolling. Don't fall for it.

 

Also, I think DisplayPort and USB-C/Thunderbolt are the future for PCs. I never liked HDMI, but it became standard rather quickly because of HDTVs.

Intel 4770k@4.6GHz, ASUS ROG Maximus VI Hero, Kingston HyperX Beast 2x8GB 2400MHz CL11, Gigabyte GTX 1070 Gaming, Kingston HyperX 3k 240GB - RAID0 (2x120Gb), 2xWD 1TB (Blue and Green), Corsair H100i, Corsair AX860, CoolerMaster HAF X, ASUS STRIX Tactic pro, Logitech G400S, HyperX Cloud II, Logitech X530, Acer Predator X34.


6 minutes ago, Djole123 said:

No it's not.

 

That's like saying: "Nah, I don't want to use a computer, I'd be better off using a typewriter and an abacus because they are 'trusted'".

VGA at least proved that it is a reliable standard that can be trusted.


while HDMI didn't )


2 minutes ago, linushellskitchentips said:

VGA at least proved that it is a reliable standard that can be trusted.

 

2 minutes ago, linushellskitchentips said:

while HDMI didn't )

Are you a troll, spammer, anything like that?


2 minutes ago, Djole123 said:

 

Are you a troll, spammer, anything like that?

nope)


3 minutes ago, linushellskitchentips said:

nope)

I don't wanna get any further into this.

 

HDMI is better. Period.


11 minutes ago, Djole123 said:

I don't wanna get any further into this.

 

HDMI is better. Period.

 

13 minutes ago, linushellskitchentips said:

nope)

I have now put an HDMI cable between my monitor and PC. The picture does look crisper. When I put my VGA cable back in, I can notice a slight blur.


If you configure the output properly, HDMI will be better than VGA in any situation.

 

HDMI and DVI are digital, meaning there's no signal degradation between the video card and the monitor. What is sent by the video card is received by the monitor.

For every pixel on the screen, a series of bits is sent to the monitor: 8 bits for each colour component of the pixel (red, green and blue, or Y/Cb/Cr, i.e. luminance and chroma components). Each 8-bit value is expanded into a 10-bit symbol on the wire (TMDS encoding), which keeps the signal balanced and easy for the monitor to decode reliably, even over a longer cable.

So we have 8b/10b encoding: for every 8 bits of data, 10 bits are sent, so each pixel takes 3 colours × 10 bits = 30 bits on the wire (24 bits of colour data plus 6 bits of encoding overhead). For a 1920x1080 image that is 1920 × 1080 × 30 = 62,208,000 bits per frame; multiply by 60 updates a second and we get 3,732,480,000 bits per second, or 466,560,000 bytes per second (roughly 445 MB/s).
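If anyone wants to sanity-check that arithmetic, here is a small Python sketch of the same back-of-the-envelope calculation (my own illustration of the simplified model above; it ignores blanking intervals, so real TMDS link clocks come out a bit higher):

# Simplified TMDS (DVI/HDMI) bandwidth estimate for 1080p60,
# following the numbers above: 3 colour channels, 8 data bits each,
# expanded to 10-bit symbols on the wire. Blanking is ignored.

width, height, refresh_hz = 1920, 1080, 60
channels = 3                 # red, green, blue (or luminance/chroma)
bits_per_symbol = 10         # 8 data bits -> 10 transmitted bits

bits_per_frame = width * height * channels * bits_per_symbol
bits_per_second = bits_per_frame * refresh_hz

print(f"bits per frame  : {bits_per_frame:,}")        # 62,208,000
print(f"bits per second : {bits_per_second:,}")       # 3,732,480,000
print(f"bytes per second: {bits_per_second // 8:,}")  # 466,560,000
print(f"~{bits_per_second / 8 / 1024**2:.0f} MB/s")   # ~445 MB/s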

 

VGA is an analogue signal: for each pixel, the video card sends a voltage through three wires, one for each colour element (red, green and blue). The video card adjusts the voltage on each wire depending on the intensity of the colour (for 8-bit colour, 256 levels of intensity), so the voltage on the wire varies between 0 V and 0.7 V in 256 steps.

The video card has to be strong enough to adjust the voltage on those 3 wires for every pixel on the screen, every time the image refreshes.

 

So for example, if you want to update the screen 60 times (60 Hz), the video card has to push all those pixels to the monitor in 1000 ms / 60 = ~ 16.6 ms

In those 16.6 ms, the voltage on those three wires has to change between 0 V and 0.7 V up to 1920 × 1080 = 2,073,600 times, which leaves only about 8 nanoseconds (16.6 ms ÷ 2,073,600 ≈ 8 ns) for the video card to set the voltage on each of those 3 wires and for the monitor to read that voltage at the other end of the wire and convert it back to 8 bits of information.

In those 8 nanoseconds, if some electromagnetic radiation hits the cable and the shielding is not good enough, the voltage on that wire can shift just enough that the monitor converts it to a value slightly bigger or slightly smaller than what was sent. So you may see an ever so slightly brighter pixel, and - if the cable is hit for a long enough period - you may even see that blurring effect caused by external noise.

Ferrite beads on both ends of the cable help alleviate this radiated noise. A properly thick VGA cable, with each colour wire twisted with a ground wire and also shielded separately inside the cable, would help as well, but the cables bundled with LCD monitors these days are usually very thin and basic - the absolute minimum.
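To illustrate how a little noise moves a pixel value, here is a toy Python model of a single VGA colour wire (my own simplification, not how a monitor's ADC is actually built): 0-0.7 V split into 256 levels, with a few millivolts of induced noise added.

FULL_SCALE_V = 0.7
LEVELS = 256
step_v = FULL_SCALE_V / (LEVELS - 1)   # ~2.75 mV per intensity level

def decode(voltage_v):
    """Convert a wire voltage back to an 8-bit intensity value."""
    level = round(voltage_v / step_v)
    return max(0, min(LEVELS - 1, level))

intended = 128                          # mid-grey the video card wanted to send
sent_v = intended * step_v
noisy_v = sent_v + 0.005                # 5 mV of noise picked up by a poorly shielded cable

print(decode(sent_v))    # 128 - clean cable, the value survives
print(decode(noisy_v))   # 130 - noisy cable, the pixel comes out slightly brighter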

 

VGA worked great for lower resolutions and high refresh rates, like 1024x768 at 85-100 Hz, because there's simply more time for each pixel colour on the cable. For example, at 1024x768 and 100 Hz the video card has to send 786,432 pixels in 1000 ms / 100 = 10 ms, or one pixel's colour information roughly every 13 nanoseconds - almost twice the time available at 1080p 60 Hz. Plenty of time for the video card to put a specific voltage on the wire and for the monitor to read it at the other end.
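As a rough sketch (again ignoring blanking intervals, which shorten the real per-pixel budget a little), the time available per pixel can be computed like this:

def pixel_time_ns(width, height, refresh_hz):
    """Nanoseconds available per pixel, ignoring blanking intervals."""
    frame_time_s = 1.0 / refresh_hz
    return frame_time_s / (width * height) * 1e9

print(f"1920x1080 @  60 Hz: {pixel_time_ns(1920, 1080, 60):.1f} ns per pixel")   # ~8.0 ns
print(f"1024x768  @ 100 Hz: {pixel_time_ns(1024, 768, 100):.1f} ns per pixel")   # ~12.7 ns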

 

Also, keep in mind that some older video cards had better-quality and often more expensive RAMDACs (the digital-to-analogue converters that generate the VGA signal). On newer video cards, manufacturers slowly moved to save a few cents by using highly integrated RAMDACs that don't necessarily produce better quality but take less PCB space and need fewer external parts. Quality on the VGA connector is no longer a priority when DVI, HDMI and DisplayPort are all digital, sending a 1:1 image to the monitor without quality loss.

 

Some new video cards don't even have the analogue circuitry anymore - no VGA output at all - so even a passive DVI-to-VGA adapter won't give you an analogue signal. With those cards you'd have to use an ACTIVE adapter, which has a chip inside, powered from the HDMI connector, that decodes the HDMI signal and produces the VGA signal.

 


14 minutes ago, mariushm said:

If you configure the output properly, HDMI will be better than VGA in any situation.

 


 

OK ) Yeah, analogue has its minuses, but a VGA cable is still the easiest cable to find in Belarus - that's why people in post-USSR countries still use it.


15 minutes ago, mariushm said:

If you configure the output properly, HDMI will be better than VGA in any situation.

 


 

Thank you for the article.


1 hour ago, Djole123 said:

VGA has no future. Like c'mon! VGA is chillin' around since the early 90's (and maybe even earlier)!

Yep. Introduced in 1987, replaced by DVI in 1999.


3 hours ago, GeorgeKellow said:

I've got a 21.5-inch 1080p AOC monitor (without speakers) and an MSI GTX 970.

At the minute I am using VGA (DVI-to-VGA adapter). Would I notice a benefit by using HDMI?

If you're gaming with a decent VGA cable - no.

If you're doing any kind of design work or media creation - YES. I've compared them side by side, and my monitor automatically calibrates its settings when connected over HDMI (and doesn't allow me to change anything except brightness/contrast). Colours look much more vibrant and the display is visibly crisper.

 

1 hour ago, A/C said:

He's trolling. Don't fall for it.

 

Also, I think DisplayPort and USB-C/Thunderbolt are the future for PCs. I never liked HDMI, but it became standard rather quickly because of HDTVs.

For me personally, DisplayPort monitors and cables are simply too expensive :/

 

1 hour ago, linushellskitchentips said:

VGA at least proved that it is a reliable standard that can be trusted.

 

1 hour ago, linushellskitchentips said:

while HDMI didn't )

VGA is only used because the standard has constantly been upgraded since the 80s, and because it is backwards compatible with ancient potato computers.

Speedtests

WiFi - 7ms, 22Mb down, 10Mb up

Ethernet - 6ms, 47.5Mb down, 9.7Mb up

 

Rigs

Spoiler

 Type            Desktop

 OS              Windows 10 Pro

 CPU             i5-4430S

 RAM             8GB CORSAIR XMS3 (2x4gb)

 Cooler          LC Power LC-CC-97 65W

 Motherboard     ASUS H81M-PLUS

 GPU             GeForce GTX 1060

 Storage         120GB Sandisk SSD (boot), 750GB Seagate 2.5" (storage), 500GB Seagate 2.5" SSHD (cache)

 

Spoiler

Type            Server

OS              Ubuntu 14.04 LTS

CPU             Core 2 Duo E6320

RAM             2GB Non-ECC

Motherboard     ASUS P5VD2-MX SE

Storage         RAID 1: 250GB WD Blue and Seagate Barracuda

Uses            Webserver, NAS, Mediaserver, Database Server

 

Quotes of Fame

On 8/27/2015 at 10:09 AM, Drixen said:

Linus is light years ahead a lot of other YouTubers, he isn't just an average YouTuber.. he's legitimately, legit.

On 10/11/2015 at 11:36 AM, Geralt said:

When something is worth doing, it's worth overdoing.

On 6/22/2016 at 10:05 AM, trag1c said:

It's completely blown out of proportion. Also if you're the least bit worried about data gathering then you should go live in a cave a 1000Km from the nearest establishment simply because every device and every entity gathers information these days. In the current era privacy is just fallacy and nothing more.

 


1 hour ago, linushellskitchentips said:

OK ) Yeah, analogue has its minuses, but a VGA cable is still the easiest cable to find in Belarus - that's why people in post-USSR countries still use it.

I bet you guys all still use Windows 95 too because it's "the easiest to find and most trusted OS for people in post-ussbackwoodsr countries" right?


Well, Windows 95 is mostly only used at workplaces; at home most people use Windows 7, except the ones who are concerned about privacy - those use Linux.


1 hour ago, linushellskitchentips said:

Well, Windows 95 is mostly only used at workplaces; at home most people use Windows 7, except the ones who are concerned about privacy - those use Linux.

lol wtf

All the businesses and institutions I've visited (which is a lot) use WinXP and Windows 7, & almost everyone I know at home has updated to Windows 10.

:)


1 minute ago, burnttoastnice said:

lol wtf

All the businesses and institutions I've visited (which is a lot) use WinXP and Windows 7, & almost everyone I know at home has updated to Windows 10.

:)

Windows 10 sucks, it's a piece of spyware that you should delete right away.


1 minute ago, linushellskitchentips said:

Windows 10 sucks, it's a piece of spyware that you should delete right away.

everything nowadays is spyware

youtube

google

microsoft

your phone

 

everything uses analytics and user preferences to force more ads upon you

so windows 10 is really the least of my concerns


On my computer I have AdBlock installed.


8 minutes ago, burnttoastnice said:

everything nowadays is spyware

youtube

google

microsoft

your phone

 

everything uses analytics and user preferences to force more ads upon you

so windows 10 is really the least of my concerns

And by the way, Linux doesn't spy on you like Windows does.


4 hours ago, burnttoastnice said:

If you're doing any kind of design work or media creation - YES. I've compared them side by side, and my monitor automatically calibrates its settings when connected over HDMI (and doesn't allow me to change anything except brightness/contrast). Colours look much more vibrant and the display is visibly crisper.

HDMI would be better if the goddamn manufacturers allowed us to fully adjust the display output... that's the only reason VGA looks nicer. Also, a VGA cable is much easier to plug in because the plug is big enough. HDMI and DisplayPort are so small that if you're ½ mm off it won't go in. DVI is also good in this sense, but not on laptops. (I like big connectors; they are easier to handle in every way.) You can also tell which way the VGA plug is oriented without looking, because the edges are so clearly angled. You also get more leverage out of the wider plug to fight against the cable. (Seriously, how does it happen that the cable is always twisted because the port on your device sits at a different angle than your display? Illuminati?) I like DVI-I the most, because it is compatible with most potatoes I have found (analogue + digital... WOW), and it has screws on the plug. A screw is a better solution than a springy flap, because it won't wear out as quickly if you don't force it. I could continue this all day long, but I'm not bothered. Wait, I got a little distracted from the topic... oh well.

