Showing results for tags 'hdcp'.
-
Hello LinusTechTips community,

So after 3 days of intensive testing, I managed to find the issue that prevented my browser from playing content from my TV service provider (Vodafone PT). The content will NOT play if ANY of my displays is not HDCP-capable. As you can see in the first picture, all 3 of my monitors are HDCP-capable. They are indeed HDCP-capable, but ONLY when turned on: the NVIDIA Control Panel reports a monitor as not HDCP-capable if that monitor is off, as in the second picture.

So, my question is: how can I prevent this from happening? I want to watch TV on one of my secondary displays, but I do not want to keep all 3 displays turned on when I'm not using the third one. I know I can change my display settings in Windows and deactivate the display I'm not using while watching TV, but that's not very practical, right? I always keep my 3-display configuration active and just turn the displays I need on and off at a given moment. Any solution? Is this an issue NVIDIA might already be aware of and trying to fix?

Thanks in advance for any information you can provide. Best regards, José Reis.
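The behavior described above can be sketched as a weakest-link check: protected playback succeeds only if every display that is still attached to the desktop passes the HDCP handshake, and a powered-off monitor reports as not capable. This is a minimal illustration of that rule, not NVIDIA's actual logic; the display names and flags are made-up examples.

```python
# Sketch of the rule observed above: playback is blocked if ANY
# attached display fails the HDCP check. A powered-off monitor
# reports as not HDCP-capable, so it counts as a failing link.
# Display names and flags are made-up examples.

def playback_allowed(displays):
    """displays: list of (name, attached, hdcp_capable) tuples."""
    return all(hdcp for _, attached, hdcp in displays if attached)

# All three monitors attached; the powered-off one reports no HDCP.
setup = [
    ("Monitor 1", True, True),
    ("Monitor 2", True, True),
    ("Monitor 3", True, False),  # turned off -> reported not capable
]
print(playback_allowed(setup))   # False: the off monitor blocks playback

# Detach the unused display in Windows display settings and it no
# longer takes part in the HDCP handshake.
setup[2] = ("Monitor 3", False, False)
print(playback_allowed(setup))   # True
```

This is why deactivating the unused display in Windows works even though it is tedious: a detached display simply drops out of the handshake.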
-
So, I'm looking for an HDMI KVM switch for 10+ Windows 10 PCs using RTX 30 series cards that will also be compatible with HDCP. I have found one that claims HDCP 1.3, but I'm not sure how to tell what version of HDCP my PCs need. TIA
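As a rough rule of thumb (an assumption for illustration, not a quote from any spec): protected 4K content generally requires HDCP 2.2, while 1080p protected content is typically fine with HDCP 1.4, and 2.x devices interoperate downward with 1.x displays. A switch rated only for HDCP 1.3/1.4 would usually cause 4K streams to fall back to 1080p or refuse to play.

```python
# Rough rule of thumb, hedged as an assumption: protected content above
# 1080p generally needs HDCP 2.2; 1080p and below is usually satisfied
# by HDCP 1.4.

def min_hdcp_version(vertical_resolution):
    """Minimum HDCP version commonly required for protected content
    at a given vertical resolution (illustrative, not a spec)."""
    return "2.2" if vertical_resolution > 1080 else "1.4"

print(min_hdcp_version(1080))  # 1.4
print(min_hdcp_version(2160))  # 2.2
```

So for 1080p desktop work a 1.3/1.4-rated switch is probably fine; if any of the machines will play protected 4K content through it, look for a switch that explicitly advertises HDCP 2.2.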
-
Hi guys! I'm trying to get my Surface Pro 7 to Miracast to my gaming rig so I can use the Surface as a second monitor. Miracast seems to work perfectly fine on the Surface (go figure), but my desktop says what the title says. I've looked at numerous troubleshooting forums and didn't get any truly helpful information, so hopefully I can get some real help here.

As far as the steps I've taken, I've done everything in this guide: https://www.minitool.com/news/your-pc-doesnt-support-miracast.html This includes:
- reinstalling my network adapter driver
- making sure my NDIS version is sufficient
- making sure my display driver model is sufficient
- making sure I'm connected to the network (of course)

The two problems I'm having with the article are: 1) I can't set my Wireless Mode Selection to Auto, as it doesn't appear in the Advanced tab for my wireless adapter; and 2) again, the title, where it says Miracast is available but with no HDCP.

I've read elsewhere that HDCP may not be compatible with older hardware, but this desktop is from 2018. It started on Windows 10 and hasn't had any other version or OS before. I'm also running an MSI Aero GTX 1080 and an i7-8700K, so I know this hardware is equipped to run Miracast. I'm not sure if a BIOS setting got fudged in the setup process, or if there's something else I'm missing. All the search results I've found barely give any information on troubleshooting this, and most of the time it seems to be other third-party devices that have these problems. However, these are both newer, strictly Windows 10 devices, so I don't know what I'm missing exactly.

Thank you for reading! I hope someone can help; I've been working on this for two hours now to no avail. Y'all are my final lifeline.
- 1 reply
- Tagged with: networking, wireless display (and 2 more)
-
I am NOT pirating or trying to do anything illegal, just looking for help. I just got a capture card (EVGA XR1 Lite) to use for streaming video games and my lectures (I'm a teacher). I thought everything was supposed to be plug and play, but when I created the scene in OBS, all that comes up is a big ol' 'HDCP' message. I have an AMD RX 5700, on which I have now disabled everything to do with HDCP, and the problem still persists. I hooked up my GoPro and it worked fine, but when I hooked the card up to my gaming or streaming PC it would not. Thank you.
-
I recently realized that two of my Sony receivers (in different rooms) had both stopped outputting HDMI signals to the two connected televisions. Both systems had been working fine for 2+ years before "it" happened. I've already been through a paroxysm of powering off and reconnecting cables, but have not reset memory in any device (other than powering off).

The larger receiver has TWO HDMI outputs, and I eventually discovered that it is happy to output video on the non-ARC connection. I can even switch between the two outputs using my remote, and when I select the ARC output the signal stops. The smaller receiver only has an ARC output, so I can't get a signal through it.

I suspect a software update. The television connected to the larger receiver is from 2014, and while it shouldn't be getting system updates, it does appear that at least one of its apps (YouTube, I think) has been updated. Both receivers have a Roku (could they have initiated something I've seen referred to as an HDCP lockup?), and the second television is an Android TV that has received updates recently. I should mention that one of the Rokus gets unhappy about HDCP every few days, even when connected directly to the television.

I am reaching out for any further insight on possible causes and steps to fix this. I'm trying to avoid resetting the larger receiver, as it has assignable inputs and will take a while to reconfigure. And of course I'd like to feel that what I do makes some kind of sense.
-
My Windows tablet has USB-C ports only, and I want to connect a 1080p monitor to it. I am going to use it for work and for watching Netflix in 1080p without HDR. While searching Amazon for USB-C hubs, I found that one of them had an "HDCP Compliant" label on it. It is more expensive than the others. After reading various blogs, I understood that HDCP is only required for 4K content, for which HDCP 2.2 is required and the dongle must support 4K60. But the hub with the HDCP Compliant label states that it supports 4K30 only. I am sure that my tablet and monitor are HDCP compliant. Will the hub cause any problems while watching Netflix? Are the other USB-C hubs, which state that they support 4K30 but do not carry the HDCP Compliant label, useless?
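The question above really involves two independent checks, and this sketch separates them (it is an assumption based on the reasoning in the post, not Netflix's documented requirements): the hub must have enough bandwidth for the display mode you want, and every link in the chain must pass the HDCP handshake. A "4K30" rating does not rule out 1080p60, because 1080p60 needs far less bandwidth; total pixel rate is used below as a rough proxy for bandwidth.

```python
# Two independent checks for a hub, sketched as an illustration:
# 1) bandwidth: can it carry the target mode? (pixel rate as a rough proxy)
# 2) HDCP: does it pass the handshake through to the monitor?
# The numbers below are made-up example modes, not a product spec.

def pixel_rate(mode):
    """Pixels per second for a (width, height, refresh-Hz) mode dict."""
    return mode["width"] * mode["height"] * mode["hz"]

def hub_ok(target, hub_max, hub_hdcp):
    """True if the hub can carry `target` and passes HDCP through."""
    return pixel_rate(target) <= pixel_rate(hub_max) and hub_hdcp

# Netflix at 1080p60 on a hub rated for 4K30 with HDCP pass-through:
target = {"width": 1920, "height": 1080, "hz": 60}
hub_max = {"width": 3840, "height": 2160, "hz": 30}
print(hub_ok(target, hub_max, hub_hdcp=True))   # True
```

So a 4K30 limit is irrelevant for a 1080p monitor; what matters is whether the hub passes HDCP through at all. Hubs without the label are not necessarily useless, but they do not promise that second check.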
-
- Tagged with: hdmi, type c hub (and 2 more)
-
Several months back a buddy of mine gave me a couple of monitors. One of them was an HP w2338h 23-inch 1080p monitor, for free mind you! It worked flawlessly for a couple of months, until I tried to use an old Rocketfish HDMI splitter on it. After plugging this inglorious little bastard into my monitor, things were never again so great. Now the monitor has an odd little quirk where my Raspberry Pi can display fine over HDMI, but my PC and PS4 are an absolute no-go. What's more, this same issue occurs on another monitor that the splitter was used on. I've tried other HDMI cables, no luck. I just want my monitors to work again. Hoping I'm overlooking something here.
-
Hi, I'm new to the forums. I can't find the info I want anywhere else on the internet, unless I skipped something; I have been looking for 2 days. The issue is that I just bought a new 4K TV (LG 55UH6150) and a new receiver (Pioneer VSX-831-K), and now I'm trying to get my PC to display movies, YouTube, etc. on my TV from my graphics card (GTX 780 Classified) over HDMI. When I am just displaying the screensaver on my monitor, I get a flicker at the bottom of the screen and the black taskbar sometimes moves up and down really fast, but when I drag YouTube or any other window over to it, the flicker stops completely. What could cause this? Any help would be greatly appreciated! All the drivers are up to date, and I'm running high-speed HDMI cables that support HDMI 2.0a.
- 2 replies
- Tagged with: screen flicker, hdmi (and 4 more)
-
My best guess is that with WMC support being essentially gone, the 1070 is too new or something... I'm fighting to keep my HTPC functionality alive and as current as possible. See here: "Your graphics card or driver doesn't support content protection." There's more info in this post: Anyone have cablecard HTPC experience? I figured that since WMC now indicates this is related to my graphics card, I would post here too. Thanks for reading!
-
I have two monitors. One is HDCP 1.4 (Acer XB271HU) and the other is HDCP 2.2 (LG 27UK850-W). I was wondering: is there any way to bypass the HDCP on the Acer monitor over DisplayPort? I've seen people mention ways to do it over HDMI, but none using DisplayPort.
-
Our newer TV broke, so we've pulled this older TV https://www.sony.com/electronics/support/product/kds-r60xbr1/manuals out of our basement for the sake of saving some money. We're trying to hook our Roku 3 up to it directly, and no matter what we try, we keep getting this screen.

To start, after each attempt we restart both devices. We've also messed with the display settings on the Roku, and it won't even play 720p video without this error popping up. We can't find anything on HDCP in the TV manual, and there's nothing in its menus. We've tried at least 2 different HDMI cables in the multiple HDMI ports. Currently, there is nothing else connected to the TV.

Update: We tried running our DVD player on it and had the same problem. I don't know if that narrows anything down, but... Any help or direction is really appreciated!
-
Not sure exactly where this should go. I have been fighting with stuttering on my system for a while now and have finally fixed it, and I want to share what worked. Any of the recent Radeon Adrenalin updates (18.x) resulted in stuttering in Windows 10, not in any specific game or application, but across the entire system. When watching a video, it was like having it stop and buffer every 3 or so seconds.

This fix is specific to any system using an HDMI monitor connected directly to the HDMI port on the RX 580, not through a DP-to-HDMI adapter.

Here are the relevant system components:
- CPU: AMD Ryzen 5 1600
- GPU: Gigabyte Aorus RX 580 8GB
- MOBO: Gigabyte AX370-Gaming K5
- RAM: G.Skill Ripjaws DDR4 3200
- Monitors: LG 29UM59 Ultrawide (1 on DP using an adapter, 1 on HDMI !! IMPORTANT !!)
- GPU Driver: Radeon Adrenalin 18.8.2
- Windows Build: 17134.254
- Windows Version: Windows 10 1803 Update

Here's what worked:
1. Open AMD Radeon Settings
2. Go to "Display"
3. Select the HDMI monitor, click on "Specs"
4. Go to "Override"
5. Disable HDCP Support

It took me forever to figure this out, so I wanted to make it easier for anyone else having the same problem. Your mileage may vary.
-
Hello dear people,

So here is the thing. My friend has a big TV and it is quite old. It has a DVI connector, SCART, S-Video, component (I think that is what it is called) and some other old connectors. It doesn't have HDMI, nor does it have VGA. She doesn't have cable TV because the people who previously lived there didn't pay, the bill is now very large, and she doesn't want to pay it, nor does she know the people who were living there. The solution is for her to buy an Android TV box, this one: the W95. The box works fine; we have checked it on my TV. Although the sound over the HDMI cable is unusable and choppy, we will connect a set of speakers to the box, so that is not a real issue.

She needs an HDMI-to-DVI adapter to connect it to the TV. We bought one and tried to connect the box to the TV, and the screen is blank, just a black picture. We tried connecting a laptop with the same cable and HDMI-to-DVI adapter: works fine. Tried changing some settings on the TV: nope, still a black screen. After a lot of googling, I think it may be the HDCP compliance thing. BUT, I have read that all new laptops have HDCP, and the laptop worked fine when connected to the TV. Why wouldn't the Android TV box work then? Am I missing something, or is HDCP the problem? Do you have any idea what the cause might be, or maybe a suggestion or a solution? If you need any info, please ask; I may have missed something.
-
- Tagged with: android, android tv box (and 3 more)
-
Hi guys, I recently bought a new TV (Samsung JS9800) and I have an HTPC I'm building for it. I have a Blu-ray drive that supports 4K Blu-ray (just waiting for software), but I found out that you need HDCP 2.2 for 4K Blu-ray playback. I was wondering: does the R9 390 support HDCP 2.2 (I know the R9 390 is HDCP compatible, I just don't know the version), or does that depend on the TV? Thanks, Arush
-
My family and I are on vacation, and we are trying to use Vudu to watch a movie on the big TV in the hotel room over an HDMI cable from my laptop. However, when we try to play it, we get an error saying that it cannot play because of copyright issues, because the TV is not HDCP compliant. It seems like we are probably just screwed, but is there any way to get around this?
-
So I got The Force Awakens on iTunes on my laptop.
- I know iTunes is bad
- I wanted HD instead of SD, otherwise I would have bought an actual DVD
- I don't know if the Google Play store lets me buy a movie and download it locally
...bla bla bla, so I chose iTunes.

I tried to play it in HD, but it said this: "To play this movie in HD, you must have a PC with a built-in display or have it connected to a display that supports HDCP." It plays in Standard Definition just fine.

My laptop is a Dell XPS 13 9333 (Core i7-4650U with Intel HD 5000 graphics). I am using a Mini DisplayPort to HDMI adapter that is HDCP compliant. My TV is a Samsung Series 6 UA32F6400AM.
- Supposedly all HDMI TVs support HDCP
- The user manual for this TV does not mention HDCP anywhere
- The settings of the TV do not mention HDCP

It seems like something in the hardware chain is not supporting HDCP. WHAT THE HELL IS IT? IS IT THE DAMN TV? I checked all this stuff out BEFORE I bought the movie; I was sure the HDCP would be fine. Pls help.
-
I copied this from another forum; I just want the most answers possible. Thank you.

So I wanted to connect my PS4 to my PC monitor just because it is more convenient for me. I know HDMI on this monitor is limited to 60Hz; no problem, the console is the same. My problem is that I cannot get a picture through HDMI on this monitor. HDMI works with my PC just fine (I had to test whether the monitor's port was bad); I normally use DVI on my PC. I disabled HDCP on the PS4 and set its resolution to 1080p as well, and I got a picture for about 10 minutes; then it went black again... I still get audio coming through. The PS4 works perfectly on all of my TVs. I have tried all of the available resolutions on the PS4 too, and no luck... I have also tried different HDMI cables; nothing has helped thus far. I can hear the menu of the PS4, but I can't get an image. Please, any suggestions are welcome!
- 4 replies
- Tagged with: playstation, ps4 (and 6 more)
-
http://www.newegg.com/Product/Product.aspx?Item=N82E16814127783&cm_re=750ti-_-14-127-783-_-Product I'm looking at buying that GPU; however, I do not fully understand what HDCP is. Is it basically just for when you want to put an HD movie into your PC and watch it, or is it for all "HD"-formatted videos? For example, I have some videos on my PC that are supposed to be Blu-ray or HD format; will I still be able to watch those if my monitor isn't HDCP-ready? The monitor (a TV) does have HDMI. I just want to make sure that I won't lose the ability to watch the videos I already have if I do in fact order this card. Hope to get a swift reply; I'm looking to purchase a GPU today, as it's the last thing I need to complete my build. Thanks in advance.
-
Well, it took them long enough IMO, but the PlayStation Blog now has an article stating that in the coming weeks a new update (presumably 1.70) will include the option to switch off HDCP, among other additions such as video editing software in the OS, the ability to have the 'Share' feature drop files onto a USB drive for easy export, and the potential for longer clips to be recorded. This means people who want to capture gameplay using Elgato/PC cards without HDCP strippers will be able to do so freely; Twitch streams are apparently going to receive a bitrate buff, and Twitch sessions streamed from a PS4 will now support archiving. All in all good stuff IMO; it's the next best thing to a PC, haters gonna hate. (I said next best thing.)

Source: http://blog.eu.playstation.com/2014/03/19/new-ps4-update-add-share-enhancements-hdcp/

INB4 PC master race comment.
-
I have an HDMI splitter with HDCP 1.2. Is there a way to flash it or otherwise update it to a newer HDCP version?