Search the Community
Showing results for tags 'hd graphics'.
-
Hello, I've posted before about my plan to build a computer, and now I've built it. It has an i5-11600K, MSI Z590 Pro WiFi, dual Viper Elite II 3600MHz DDR4 RAM, an XPG 8200 Pro SSD for the OS, a GamePower Skadi 240mm AIO, and an XPG PYLON650B-BKCEU 80+ 650W PSU. I didn't buy a GPU for now; I'm waiting for discounts for a while. In the meantime I wanted to play some games the UHD 750 can handle. I downloaded Call of Duty Modern Warfare (not the Remastered one), which should be easy to run. It actually plays smoothly when it works, but it crashes every time: the screen gets red vertical lines, and the keyboard, mouse, and reset button on the case all stop responding. I tried XMP on and off, auto overclock features on and off, updated everything, and used different drivers for the UHD 750, but it still continues. (I'm playing on a 1440x900 monitor, not even 1080p.) I was really hyped before, but my mood has fallen now. Do you have any ideas? Best regards *The images were taken with a mobile phone; there is something in Event Viewer but I couldn't find anything useful about it.
-
Intel's integrated graphics solutions grew out of a simple need from the business/corporate world: to have enough graphics power to host a desktop, surf the internet, and write documents or view presentations. This was a simple task, and Intel found it had extra unused die space that could be utilized for it, but the graphics engineering team was never given more than what was deemed "unused space." This is why the integrated graphics processor has been weak, but there are more fundamental differences between Intel's HD Graphics architecture and Nvidia's Kepler/Maxwell or AMD's Hawaii. For one simple instance, Intel's graphics architecture does not possess a standalone multiply instruction in its SIMD units, while Nvidia and AMD both implement the ability to process one fused multiply-add (FMA) instruction and one multiply instruction simultaneously. To put this in perspective, HD Graphics 5300 possesses 24 EUs, each made of 2 SIMD units, each hosting 4 32-bit floating-point units (also capable of integer calculations), and each FPU capable of just one FMA per cycle. To calculate FLOPS, we use the formula EUs * SIMD units per EU * FPUs per SIMD * operations per cycle * clock speed. For HD 5300: 24 * 2 * 4 * 2 (an FMA counts as one multiply plus one add) * 850 * 10^6 = 326.4 gigaflops. If Intel implemented a standalone multiply instruction as well, the total would be 192 FPUs * 3 operations * 850 * 10^6 = 489.6 gigaflops, a 50% increase in throughput capability. On a more complex note, Intel's graphics architecture does not possess a hardware-level tessellation or geometry engine, technology very often used in games nowadays. Of course, Intel did not develop its integrated graphics for games, so it's not surprising that it currently lacks this technology. Intel's current best graphics solution, Iris Pro 5200, possesses 40 EUs and runs at 832 gigaflops. If Intel implemented a simultaneous multiply instruction, the total computing power of this SKU would rise to 1.248 teraflops, a hefty sum for an integrated graphics processor.
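The arithmetic above can be checked with a quick sketch (the `gflops` helper is my own illustration; the EU counts, unit widths, and clocks are the ones quoted in the paragraph):

```python
def gflops(eus, simd_per_eu, fpus_per_simd, ops_per_cycle, clock_mhz):
    """Theoretical throughput: EUs * SIMD/EU * FPUs/SIMD * ops/cycle * clock."""
    return eus * simd_per_eu * fpus_per_simd * ops_per_cycle * clock_mhz / 1000.0

# HD Graphics 5300: 24 EUs x 2 SIMD units x 4 FPUs, 2 ops per cycle
# (one FMA counts as a multiply plus an add), at 850 MHz:
print(gflops(24, 2, 4, 2, 850))   # 326.4

# With a hypothetical standalone multiply issued alongside the FMA
# (3 ops per FPU per cycle), as discussed above:
print(gflops(24, 2, 4, 3, 850))   # 489.6
```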
The Iris flops number is somewhat startling, because it means the count of SIMD units per EU and FPUs per SIMD is exactly the same between Gen 7.5 and Gen 8, which is a bit annoying personally considering how much die area was gained in the shrink from 22nm to 14nm. Intel has promised Gen 8 graphics (Broadwell, Braswell, and Cherry Trail iGPUs) will provide 20% more cores per tier (48 vs. 40 on the next Iris Pro 6200 SKUs, for example), better tessellation performance, more cache per GPU core (not the 128MB eDRAM LLC, but the L1/L2 GPU cache), increased pixel fill rate, and some unknowns. This paragraph's information is courtesy of the Motley Fool. Furthermore, Gen 8 has added double-precision and half-precision support, as well as shared virtual memory (making programmers' lives much easier). The number of EUs per subslice (a compute group sharing a sampler and data port, where more EUs sharing means less throughput each) has decreased from 10 to 8, and Intel has added a 3rd subslice to each slice (a collection of compute groups), allowing more variations of GPU configuration. Lastly, local bandwidth from the cores to the L3 cache has been improved for better performance (The Compute Architecture of Intel Processor Graphics Gen8, pg. 5). While this is impressive and shows Intel's hard work, some unfortunate facts remain: Intel has next to no patents on 3D graphics rendering or architecture. The lion's share is held by Nvidia, AMD, Qualcomm, ARM Holdings, and Imagination Technologies. Currently Nvidia is allowing Intel access to the bottom-of-the-barrel graphics intellectual property it holds in exchange for some of Intel's CPU-based IP, which is integrated into Nvidia's Tegra processors. However, it's far from likely that the discrete GPU duopoly of AMD and Nvidia will ever see another competitor in that realm, even Intel, as long as their patents hold and are not forced to be made publicly usable.
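The slice/subslice arithmetic works the same way (a minimal illustration; the GT3 configurations below are my reading of the Gen8 whitepaper, and `total_eus` is a hypothetical helper):

```python
def total_eus(slices, subslices_per_slice, eus_per_subslice):
    """EUs in an Intel iGPU configuration: slices x subslices x EUs per subslice."""
    return slices * subslices_per_slice * eus_per_subslice

# Gen 7.5 Iris Pro 5200 (GT3): 2 slices of 2 subslices with 10 EUs each.
print(total_eus(2, 2, 10))  # 40
# Gen 8 Iris Pro 6200 (GT3e): 2 slices of 3 subslices with 8 EUs each.
print(total_eus(2, 3, 8))   # 48
```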
Even if you despise the idea of Chipzilla taking over the graphics market, it's hard to argue with the fact that the GPU market has all but stagnated over the last 3 years, and Nvidia could do with stronger, better-funded competition. That is not to dismiss AMD, but rather to say AMD does not currently possess the finances to actually force innovation from Nvidia at this time. Intel's integrated graphics solutions are not particularly strong for gaming, but they are very good at accelerating scientific computing and driving the basic graphics needs of office workers and professional picture editors. In the most extreme case of Iris Pro graphics, you have an okay gaming solution for 1080p titles, but the price is far too high for what is offered at this point in time ($600 for the 4950HQ). What Intel can do about this going forward is anyone's guess, but in my next blog entry I will lay out a few ideas I've mused about and weigh the likelihood of each.
-
- intel
- hd graphics
-
(and 3 more)
Tagged with:
-
Problem Description: I have a Lenovo ThinkPad L440 and the problem goes like this: while running any game (any at all, even low spec), I will get an extremely smooth 60fps experience; after a while, though, the FPS will dip to a laggy 15-20fps. It goes like this for around 5 seconds, then returns to the smooth 60fps experience. It happens very consistently. Moreover, this isn't just games; it also happens while working in Windows or browsing the web. Specs: Windows 10 Version 1903 (OS Build 18362.239), Lenovo ThinkPad L440, Intel Core i3-4000M @ 2.4GHz, Intel HD Graphics 4600, 4GB DDR3 @ 1600MHz, 466 GB HDD (main drive). My Troubleshooting Attempts: I have been monitoring the GPU usage from the Task Manager Performance tab. I have noticed that as soon as the fps drops, I see an increase in GPU activity. For example, if the game was using 55% of the GPU during the smooth experience, the GPU usage will jump up to 99% and I will experience the lag (playing the browser game 'Krunker' in this case; GPU usage attached below). I have also noticed that it won't lag for the first few minutes, and then it starts lagging pretty consistently. While playing Far Cry 3, the GPU usage sits at around 99% and it still starts lagging once the GPU usage maxes out at 100%. Other Information: I have been experiencing this problem ever since I got this laptop. I have not messed with any of the Windows settings, nor installed any new programs, nor messed with hardware settings. Recently, though, I have tried messing with power plans, setting the CPU minimum power state from 5% to 100%, and I have also set the graphics power plan to Maximum Performance, but none of this made any change. One more important point to mention is that I have the laptop plugged in while I game. I have no idea what's causing this or how to fix it; I need help! Thanks
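For anyone trying to pin down exactly when these dips happen, one rough approach is to log fps at a fixed interval and scan the log for low stretches. A minimal sketch (the `find_fps_dips` helper and the sample log are hypothetical, not from the post):

```python
def find_fps_dips(samples, threshold=30.0):
    """Return (start, end) index ranges where fps stays below threshold.

    `samples` is a list of fps readings taken at a fixed interval,
    e.g. one per second from a monitoring overlay's log file.
    """
    dips, start = [], None
    for i, fps in enumerate(samples):
        if fps < threshold and start is None:
            start = i
        elif fps >= threshold and start is not None:
            dips.append((start, i - 1))
            start = None
    if start is not None:
        dips.append((start, len(samples) - 1))
    return dips

# Smooth 60 fps, then a ~5 second dip to 15-20 fps, then recovery:
log = [60, 60, 60, 18, 15, 20, 17, 16, 60, 60]
print(find_fps_dips(log))  # [(3, 7)]
```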
-
Got 10 OEM HD 6950s from a local PC shop; they came from an internet café (a place in China full of high-end PCs which players can rent for a few dollars per hour). Saw lots of OEM PCs there. They are dusty and some surface material has aged and peeled off the backplates (the material is like skin, I don't know which word to use in English; the same material as other AMD reference graphics cards, keyboards, and ThinkPads, really noisy when aged), but after all they are from 10 years ago, so I guess they were not mined (or they'd just be scrap metal by now). The tags declare they are from AMD's original factory (the green tag); the 'D33A27' tag means AMD's Malaysia factory. The sellers, who obviously thought these were some stupid old graphics cards, sold them to me for about $300. And 8 of them work well! 2 cards are broken, but I checked them and found out it's only fan problems. Tried to sell 4 of them at $250. Sorry if I caused any misunderstanding with my grammar: I mean 4 of them work well with CrossFire bridges, not one. I'm not mad, and I have no need to sell garbage for money (I got 23M points running F@H without CureCoin or Banano, at a power cost of about $50); I just don't want them to be used for mining. In some areas of China commercial power is seriously cheap and any cheap card gets bought for mining, and when second-hand sellers said 'I'll take 3 for $75' I got really angry, so I decided to keep them for myself and build 2 AMD enthusiast-level PCs. I searched Chinese second-hand markets for a few days and got all the other parts from 2010, the year the HD 6000s were kicked around by the GTX 500s and the last, most powerful AM3 CPU, the Phenom II X6 1100T, released (and it can still be a match for Intel's Cores). There is an old Chinese saying that '10 years is not too late for a true hero to get his revenge', and this is the time for the RED to get its revenge. Here is the list (of course most of it is second-hand; what kind of seller keeps parts for 10 years?).
Motherboards: 2x ROG Crosshair IV Extreme. CPUs: 2x Phenom II X6 1100T Black Edition, with original coolers. GPUs: 2x 4-way HD 6950 (to be unlocked into HD 6970s if the old pals work well). Memory: 2x 4x4GB Corsair Dominator GT 2133 MHz. Power supply: 1500W Enermax MaxRevo / 1500W SilverStone SST-ST1500. Still choosing cases, storage, and case fans.
- 12 replies
-
- hd graphics
- hd graphics 6000
-
(and 1 more)
Tagged with:
-
Hey there, I'm trying to use an old PC as my media center and connect it to my HDTV. It has an i3-2100 and an iGPU on Win 10 (don't really need more than that). The thing is, the board only has DVI and VGA and I'm trying to use HDMI, so I got myself a DVI to HDMI adapter, since I also saw in a few places online that DVI can carry audio as well. Anyway, the thing is that it's not working, and I guess I have to change something in the settings in order for it to work at all. Could really use your help, thanks.
-
So I typically play on PS4, so this is uncharted territory for me. I went home for the holidays and left my console in the dorms, but I need to game, so I'm trying to use my laptop for some light gaming. It's a Lenovo Yoga 3 Pro - 1370 and it's really not built for gaming: Intel(R) Core(TM) M-5Y71 CPU @ 1.20GHz (4 CPUs), ~1.4GHz (up to 2.9GHz), HD Graphics 5300. But I was hoping I'd be able to get something going, even 720p gaming at the very least, but it's all so bad; every game lags and stutters. Anyway, I'm using MSI Afterburner to monitor the system while I play (or try to), and I noticed that the CPU clock drops to sub-1GHz as soon as any game starts and stays there until the game closes; then it's immediately back up to the normal 2.6+. I also noticed that as the GPU clock gets higher, the CPU clock gets lower. Does anyone know why this happens and if there's a way to fix it? I really just want any sort of gaming, even 720p at low settings, as long as it's smooth; my PC, as expensive as it was, should be able to handle that. Thanks for any help at all.
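A likely explanation is that the Core M-5Y71 has a very low package TDP (around 4.5 W) shared between the CPU cores and the HD 5300, so when the iGPU ramps up, the cores have to clock down. A toy model of that trade-off (the `cpu_budget_watts` function and the wattage splits are illustrative assumptions, not measured values):

```python
def cpu_budget_watts(package_tdp, gpu_draw):
    """Power left for the CPU cores once the iGPU takes its share
    of a shared package TDP (grossly simplified model)."""
    return max(package_tdp - gpu_draw, 0.0)

# Core M-5Y71: ~4.5 W package TDP shared by CPU and iGPU.
# As the HD 5300 ramps up, little headroom remains for the cores:
for gpu_w in (0.5, 2.0, 3.5):
    print(gpu_w, cpu_budget_watts(4.5, gpu_w))
```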
-
Hello everyone, I have a laptop without a discrete graphics card, only an integrated one. Currently I have a Pentium P6200 with integrated Intel HD Graphics. I am planning to upgrade to a quad-core i7, but the quad-core series supported by the laptop's socket and chipset doesn't have integrated graphics, and the highest option I have is the Intel Core i7-640M @ 2.8GHz. So what do you guys think? Are there any solutions? Any and all opinions are more than welcome. Thank you all
-
So until my build arrives I'm using a Lenovo IdeaPad laptop to play some games at their lowest possible settings. When I set the Intel HD Graphics power plan to High Performance, it gives me lower fps than when I set it to Power Saving. What is the cause of that?
-
Hey guys, http://ark.intel.com/products/93339/Intel-Core-i7-6785R-Processor-8M-Cache-up-to-3_90-GHz This is the i7-6785R, a processor in the Skylake-R series (which also includes the i5-6585R and i5-6685R). It has the same frequencies as the i5-6600 (3.3GHz, up to 3.9GHz), and it has Iris Pro Graphics 580, not HD Graphics 530. It uses the FCBGA1440 socket, the same as the i7-6700HQ, so I think it would be an embedded CPU (with a 65W TDP, it shouldn't be a mobile processor). Wow, the Iris Pro 580 looks very good. With an 1150MHz boost clock (data from Intel ARK), it may have better gaming performance than a GTX 750 Ti. And don't forget it also has 128MB of eDRAM to increase 3D performance. This CPU could be embedded in an HTPC, to build a gaming HTPC that doesn't need a graphics card; I would like to call it an Intel APU. But the problem is, how can we find it? There aren't any motherboards or PCs with a Skylake-R CPU. Nope, those CPUs aren't for sale; they're OEM processors, so we can't find them. In my opinion, it would be great if Skylake-R processors were available to buy. I hope to see a Kaby Lake-R processor with an LGA socket next year. P.S. Sorry, my English is awful; I'm Chinese.
-
Hello everybody. I have owned a Samsung RF511-S02 since 2012 (or somewhere around that time) and it has served me well throughout the years. Lately, though, I have been experiencing a serious issue which makes using the computer borderline impossible. My Intel HD Graphics 3000 keeps glitching the screen, showing a message stating "The HD Graphics driver crashed" (or something very similar); it basically makes the screen go black (or a flat colour, like blue or green) for a couple of seconds and then restores it in a loop, but after a couple of times the screen remains black until I reboot the computer, and formatting or using updated drivers hasn't changed anything. Sometimes the glitching is a bit more severe, with some evident pixel flickering and such. Sometimes the glitching occurs even in the BIOS screen. It happens every single time I turn the laptop on, except for very rare occasions where I can use it for about 30 minutes without major issues. The dedicated GPU (Nvidia GT 540M) seems to be working fine, and I'd like to use that card only, to avoid this issue altogether, but I really can't find a way to bypass the broken HD Graphics. The BIOS doesn't feature a GPU selection menu (to disable the IGP), and even if I want to use the Nvidia card only, I can't install the Nvidia drivers unless I install the integrated GPU driver first. I know this computer is old and I should get another one, but I have very little money at the moment and this computer is the only tool I can use for any type of entertainment (watching movies, browsing the web, etc.), and I'm now writing from a PC I borrowed... so even creative or dangerous solutions are accepted, as long as I can keep the laptop alive for at least 3-4 months while I save some money. Can the integrated GPU be bypassed completely or fixed somehow (even by opening the laptop itself)? Thank you so much. 
Small note: the computer (even after a possible fix) won't be used for any specific 3D workload (like games or modelling).
-
The base clock of my iGPU is 1050MHz. I am able to OC it from the BIOS, up to 1200MHz: a 2 to 10 fps increase, depending on in-game variations. Are there any other ways to OC the iGPU, or just make it perform better in any way? Thanks!
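For a sense of the ceiling: if a game were purely GPU-bound, fps would scale roughly linearly with the iGPU clock, so 1050 to 1200 MHz is at best about a 14% gain. A quick sketch (the `scaled_fps` helper and the 30 fps starting point are illustrative; real games rarely scale this cleanly):

```python
def scaled_fps(fps, base_mhz, oc_mhz):
    """Best-case fps after a purely GPU-bound overclock (linear scaling)."""
    return fps * oc_mhz / base_mhz

# 30 fps at the stock 1050 MHz -> roughly 34 fps at 1200 MHz:
print(round(scaled_fps(30, 1050, 1200), 1))  # 34.3
```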
- 9 replies
-
- overclock
- hd graphics
-
(and 1 more)
Tagged with:
-
Hey guys, I just saw a pretty affordable Sony Handycam that sports 4K, and I was wondering: which Intel integrated HD Graphics generation started to support native 4K H.265 decoding? I think I played a native 4K video on my laptop's Intel HD 3000 graphics and it was horrible; which one started to support the codec natively? Another question: is the AVC1 codec considered a 4K codec? Because I downloaded a sample video from YouTube (via Internet Download Manager) and it ran smoothly, and the codec info stated AVC1, as seen below.
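On the AVC1 question: a FourCC tag like 'avc1' names the codec, not the resolution. 'avc1' is H.264/AVC, which can carry 4K video but is a different codec from H.265/HEVC. A small lookup sketch (the `codec_name` helper and its table are my own illustration):

```python
# FourCC sample-entry tags identify the codec, not the resolution.
FOURCC_TO_CODEC = {
    "avc1": "H.264 / AVC",
    "hvc1": "H.265 / HEVC",
    "hev1": "H.265 / HEVC",
    "vp09": "VP9",
}

def codec_name(fourcc):
    """Map a FourCC tag (as shown by codec info tools) to a codec name."""
    return FOURCC_TO_CODEC.get(fourcc.lower(), "unknown")

print(codec_name("AVC1"))  # H.264 / AVC
```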
-
So I want to buy a cheap laptop and I ended up with two options: a laptop with a Celeron N2840 and one with an E2-6110. But which one is better at what? Some websites say that the Intel HD Graphics is better than the Radeon R2 graphics, but the E2's processor is better, so I got confused. Which is better at graphics, and which one has the better processor: the Celeron N2840 or the E2-6110? I have the full specs of the laptops here, BUT the website is in Dutch; I could not find an English website on this, so good luck. Celeron N2840: http://tweakers.net/pricewatch/431931/packard-bell-tg71... E2 6110: http://tweakers.net/pricewatch/434276/acer-aspire-e5-52...
-
http://www.legitreviews.com/intel-nuc-2-0-rock-canyon-broadwell-system-images-posted-intel_155157 http://www.kitguru.net/desktop-pc/anton-shilov/intel-publishes-pictures-of-next-gen-nuc-2-0-broadwell-based-systems/ So it seems the Broadwell-based NUCs will be shown off at CES as everyone expected. To me the specs are a little light in the loafers (Core i5 for the high end, Core i3 for the lower, both only dual-core), but still: HD Graphics 6000 with 48 EUs and Intel's 8th generation of graphics core (touted at 40% greater throughput per EU due to microarchitecture reorganization). One NUC gets Ethernet (Maple Canyon), and the other gets 802.11ac wireless (Rock Canyon); why Intel didn't put both on Maple Canyon I have no clue. Big-ticket items are 2 USB 3.0 ports (one being supercharge-capable), DP and HDMI ports (whether it's DP 1.2 or 1.2a is unknown; HDMI 2.0 or older is unknown), interchangeable tops for NFC and wireless charging, and an optical remote receiver for HTPC people. Other than that, the form is a bit sleeker/sexier, but it's the same box. Whether it's fanned or fanless is currently unknown. Not much else to say until we see a demo, but a system power draw of just 45 watts under load from the wall should make a lot of people very happy.
-
EPIC GLITCH!! Titanfall Beta on Intel HD Graphics 4600! This is hands down the most epic glitch that I have ever seen in any video game. If you know of one even more epic than this, please enlighten me.
-
Alright, guys, I apologize for not starting this thread sooner; however, we have ourselves a nice little chip here! I'm using an i7-4770S in an mITX HTPC/SFF build, and it holds up quite well, especially considering the fact that this build runs on less than 150W! Here are the specs of the build: \Project\ Desktop HTPC: -Intel Haswell i7-4770S -ASRock B85M-ITX -Mushkin Enhanced Stealth DDR3-1600 8GB (2x4GB) dual-channel RAM @ 8-8-8-24 -Samsung 470 Series 64GB SSD for Windows + immediate apps -Samsung 840 Pro Series 256GB SSD for storage + 'auxiliary' apps -Lite-On CD/DVD combo drive -Intel HD Graphics 4600 (on CPU) -ASUS Xonar Essence STX PCIe sound card -PicoPSU-150-XT 12V DC-DC ATX power supply -HiFiMAN HE-400 headphones -Acer H274HLBMID black 27" LED-backlit LCD (only 20 watts!) -Windows 7 Home Premium 64-bit -Case/Chassis: Rosewill RC-CIX-01 BK glossy black steel cube Mini-ITX -AVerMedia Live Gamer Portable capture device. And a few videos to show off its graphical and performance prowess; remember, less than 150W! Feel free to let me know what other benchmarks and gameplay you would like to see!
- 15 replies
-
- intel
- hd graphics
-
(and 8 more)
Tagged with: