My absolute favourite part of them not supporting this: the Raspberry Pi Zero does have HDMI-CEC. So a $5 embedded board can do it, but an entire line of desktop GPUs cannot.
You do know that it’s 2017 and not 1995 anymore?
Because of US20080172504A1, “Interactive control apparatus using remote control signal between computer and electric home appliance” (Google Patents).
HDMI-CEC for PC is not free.
I’m using a UHD Sony as a monitor at 4:4:4 60 Hz and I’d like it to wake via CEC. I can’t have another device handle the CEC, because UHD at 60 Hz isn’t supported by such devices. This needs to be supported by NVIDIA, and I expected my 1080 Ti to have such features. Is this the right place to petition? Perhaps the patent will expire in ten years and we’ll have it?
Bumping this as well. It’s crazy this isn’t on the 10xx line. With NVIDIA now getting into the TV market with BFGD, the idea that this STILL isn’t supported is weird. Will BFGD not have CEC? I’d assume it will.
Is this going to continue to be ignored? What’s the deal here? It can’t seriously be a patent issue holding this up. Do they honestly think a feature as pivotal as this should be left out? TVs keep catering to PC users with higher refresh rates and better input support, yet NVIDIA still can’t control devices through HDMI. Missing out much?
I just bought a 4K LG OLED; it’s a smart TV with its own OS. I connected it to my PC with an HDMI cable.
But when the PC finishes its job and goes into sleep mode, the OLED TV stays on, waiting for a signal.
It seems unbelievable, but there is no way to send a CEC signal to the TV from the GPU; the only thing I can do is buy yet another device just to send the TV a CEC signal.
And Windows has nothing to do with the display this time.
I mean, the GPU recognises what kind of TV I use, at what resolution, with HDR, in 10 bits…
but it can’t turn off my TV!
Is the world upside down?
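For what it’s worth, that “buy yet another device” workaround can at least be automated. A minimal sketch, assuming a Pulse-Eight-style USB-CEC adapter and libCEC’s `cec-client` tool (both are assumptions; the GPU itself stays out of the loop):

```python
# Hedged workaround sketch: drive the TV's power state through an
# external USB-CEC adapter via libCEC's cec-client, since the GPU
# cannot send CEC itself. Assumes the libcec-utils package is installed.
import shutil
import subprocess

def send_cec(command: str) -> str:
    """Pipe a one-shot command such as 'on 0' or 'standby 0' to cec-client."""
    if shutil.which("cec-client") is None:
        return "cec-client not found: install libcec-utils"
    # -s: single-command mode, -d 1: only log errors
    result = subprocess.run(
        ["cec-client", "-s", "-d", "1"],
        input=command, capture_output=True, text=True,
    )
    return result.stdout

print(send_cec("on 0"))       # wake the TV (CEC logical address 0)
print(send_cec("standby 0"))  # put the TV into standby
```

Hooking `standby 0` into a system sleep hook (e.g. a systemd sleep script) would make the TV follow the PC to sleep, which is exactly what this post is asking for.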
How is this still not implemented?
Could you please forward an “improvement” task for HDMI-CEC support to your developers, noting its apparently high “demand/impact”?
A very common use case, and therefore one yielding great benefit/added value for the end customer, is:
The end customer has a (gaming) PC connected to an HDMI-CEC-enabled TV (e.g. in the living room) that they want to turn on automatically whenever the PC with the NVIDIA graphics card is powered on.
HDMI-CEC is nowadays a widely available standard supported on a broad range of devices.
HDMI-CEC would take up die area on the GPU. For every person who wants HDMI-CEC, there are at least 50 who would rather spend that die area on performance, having the transistors to do one more op per cycle, I’d guess. So I might be wrong, but I’d say nope, not going to happen, forget it, no dice.
Since I know the value of good CEC support (I always check for it when buying new CE gear and use it heavily), I do understand your pain. So it might be better to instead lobby the vendors who actually assemble the boards to add CEC support through external circuitry; NVIDIA could help by standardizing the GPIO/I2C used for it so that the Linux kernel could actually pick it up. The outlook for that happening is still bad, though.
Seriously? You think that adding some extra metadata packets to the HDMI transport is going to take up extra die space? I would bet my dog it could be implemented in firmware on the majority of modern cards (by a 2–5 person team in a couple of months).
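For context on why firmware feels plausible: a CEC power command is tiny. A sketch of the wire format, using opcode values from the CEC addendum to the HDMI spec (the constant names here are my own):

```python
# A CEC message is a header byte (initiator nibble, destination
# nibble) followed by an optional opcode byte. Power control needs
# exactly one opcode: 0x04 (Image View On) or 0x36 (Standby).

TV = 0x0          # logical address of the TV
PLAYBACK_1 = 0x4  # logical address of the first playback device
BROADCAST = 0xF   # address for messages to every device on the bus

def cec_frame(initiator: int, destination: int, opcode: int) -> bytes:
    header = (initiator << 4) | (destination & 0xF)
    return bytes([header, opcode])

# "Image View On" from playback device 1 to the TV: two bytes total.
assert cec_frame(PLAYBACK_1, TV, 0x04) == b"\x40\x04"
# "Standby" broadcast to the whole bus: also two bytes.
assert cec_frame(PLAYBACK_1, BROADCAST, 0x36) == b"\x4f\x36"
```

Whether the display engine’s firmware could actually drive the CEC line on these boards is the open question; the protocol itself clearly isn’t the hard part.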
They know that a feature like this would improve the media-piracy experience on PC.
The obvious reason is that there is an under-the-table industry agreement to keep this standard from working on PCs for regular consumers. Or NVIDIA has specifically avoided it to dodge the ire of the media conglomerates.
There is no reasonable argument from the perspective of the consumer not to have HDMI-CEC on consumer GPUs. It’s purely corporate back-scratching.
Don’t be such a damn shill generix.
I don’t see the connection, could you please elaborate on this?
I suspect other reasons, e.g. legal ones (fees), crude financial considerations (low demand, wouldn’t pay off), or technical ones (?).
TBH I’m not familiar with low-level GPU and board layouts/designs, but being a (high-level) programmer myself, I would also have expected this to be implementable in firmware.
These very points are why I would like NVIDIA to open a ticket in their system, to at least get feedback on which of the aforementioned reasons applies. If they judged the demand too low, I think they under-estimated it.
We can only speculate about the reasons, so a ticket would hopefully shed some light on this long-overdue feature.
Still dreaming Nvidia could produce this…
I’ve switched over to using a PC with a 3090 card this generation, and honestly one of the few things I really miss from console land (or even just Chromecast) is having the TV automatically turn itself on.
This is a common feature of extremely low end and inexpensive hardware, so I was honestly very surprised it is missing from such a high end product.
Hey NV, please note I want this too. It’s ridiculous you guys haven’t added it yet. It’s not often I find the budget for one of your products, and I’m severely disappointed your latest still don’t support such a basic function for anyone who hooks their big rig up to their big screen.
We have discussed the “HDMI-CEC support” feature internally, and it is not going to be implemented at this moment, as it needs hardware changes on the board and more planning.
We will re-evaluate the requirement in the future and may be able to give more insight then. Thank you for the understanding, and apologies for not being able to take up the request at this moment.
Everyone on this thread: I HAVE NO IDEA HOW CIRCUIT BOARDS OR BUSINESSES WORK SO I KNOW BETTER THAN YOU HOW TO BUILD YOUR PRODUCTS!!!
Literally one guy: … Uh, y’all ever heard of patents and licensing?
Nvidia talking head: No, we are not scrambling to implement this when you nerds only care about render times and FPS in all the marketing data we buy from Google. But I asked the people in charge of such conversations and they said, “Well, we haven’t really put any work into it, but I’ll take a look when I have some time. For now we have to cram more CUDA cores into the next release…”
Me: You… friggin… vocal-minority… morons!!! Y’all do realize that the little Raspberry Pi Broadcom chip that does have CEC implemented in silicon and firmware (now, I guess; it was missing some time ago) has SO MANY OTHER FEATURES THAT ARE PHYSICALLY DISABLED (usually by eFuses permanently crippling the die) BECAUSE THE BUYER (the Raspberry Pi Foundation) DID NOT LICENSE THOSE FEATURES ON THAT CHIPSET, RIGHT?!?
FFS. For all the effort you put into sounding indignant and superior, ranting about “just turn it on” without making the slightest effort to learn how it’s implemented or whether it is even physically possible on these cards as they’re sold, you choose instead to glom onto whatever rhetoric the other edgelords in this thread are throwing around. You burn all that energy trying to guilt NVIDIA into implementing something that, I’ll agree, could easily have been implemented from the start. But think about it from NVIDIA’s view: their market research probably tells them most gamers don’t even know what CEC is, much less care about its features. You’d think with all that pretend brainpower, one of you would have considered the business’s point of view (keeping revenue high enough that people’s paychecks keep clearing) and the design point of view, and had the internal integrity to separate your want for a feature from the weighing of feature feasibility by someone trying to sell a product.
Dear Nvidia: yes, please do all you can to implement every feature possible, regardless of what your (FLAWED) purchased market research and focus groups say. The more capabilities, the wider the potential market. I won’t lie: that wider potential will not translate into epic gains on your quarterly reports, but it would be a net non-zero upward trend.
Or, better yet IMHO, since we are still in this maker era where people are becoming more comfortable working on electronics, just leave an unpopulated header or solder pads we can drop our own cheap I2C MCU onto, to talk on the data lines that access those features, for those who want them. Personally, I’d just love an easy-access I2C solution for my desktops. Not everything needs to be an Arduino hooked up via serial to an overkill-spec’d RPi looped through AWS to read a 20-cent temp probe, just because the person plugging stuff in and copying code off Instructables (or some similar dumpster fire) has put so little thought into actually learning the hardware that they can’t fathom simplifying the data chain. My desktops all have I2C buses, and they have GPIOs; not as many as a Raspberry Pi, but any given desktop has them. How do you think it blinks the LED when you’re accessing your hard drive?
And you, Nvidia, believe it or not, would gain an admittedly small but still non-zero number of customers simply by giving users more access to the buses on the hardware they own, especially in this climate where it feels like we end users have less and less control over hardware we’ve paid hundreds if not thousands of dollars for. Go ask your engineers about the addressable LEDs you put in your products; I guarantee they will say they’re either WS2812s or SK6812s, both common addressable RGB LEDs (the former are sold as NeoPixels by Adafruit).
A few of you companies tried fiddling the protocol to make it seem like your lights were proprietary; of course, us makers quickly reverse-engineered it, discovered it was exactly the same protocol just clocked a little faster, and built our own controllers and drivers that work across brands. Luckily, enough of you caught on and implemented a standardized protocol that such shady practices didn’t entirely undermine what you were trying to do with the artificial proprietary-environment thing. Seriously though, down here in the bushes where the market research doesn’t reach, it would cause A LOT of buzz to have more access to data buses, and even basic GPIO interfacing that isn’t an overpriced electrical-engineering-niche PCI card, on our desktops and on the old-but-still-good mobos and hardware many of us have lying around. And that buzz is money in the bank. Tell marketing to shove it and ask the engineers to work on a simple prototype card; I guarantee if you show it off at a Maker Faire, people will lose their minds.
Tank R. - Awesomeness Consultant. ‘You have a thing, I can tell you how to make that thing awesome.’
Why bother putting two HDMI ports on the current generation of GPUs if you are not going to support all the HDMI standards that TVs, AVRs, and projectors use?
Just go DisplayPort-only; otherwise it makes no sense to have multiple HDMI ports on the GPU, since nearly all current HDMI products use and support HDMI-CEC. The major brands all support it: Sony, Philips, Samsung, LG, Panasonic, and so on, the list goes on, on every product that uses HDMI.
So please tell those bull-headed devs of yours to pull their heads out of the sand and support HDMI fully, or just stick to DisplayPort only.
The problem you are creating is that if a device has an HDMI port, customers assume it supports CEC, and we devs and programmers are tired of hearing that complaint. You are causing a support issue for devs, and that’s costing us time and money.
So just support it already, like every major brand.
It is 2023.
Just found out my 3080 doesn’t support CEC…