Full iGPU and GPU Support Out of the Box (Intel, AMD, NVIDIA) for Home Assistant OS #522
Replies: 2 comments 1 reply
- I run HA as a VM, so this doesn't affect me directly; I host things separately from HA, but...
- Definitely seconding this request. I'm quite surprised it isn't a hotter topic, or even a strong contender for the top of the roadmap, considering how "appliance-focused" Home Assistant has always been. I run bare metal now for simplicity (I want it always up, even when I tinker in my homelab), but not being able to use my Beelink mini PC's iGPU inside HA is such a waste. Just for local Piper and Whisper alone, without much more tinkering, it would be a godsend. I'm already a Nabu Casa subscriber (I could easily handle remote access myself, but prefer to support the project), and I wouldn't be surprised if local AI support in such appliances brought in more customers!
Describe your request
Hi Home Assistant Team,
Many users — especially power users running Frigate or local AI workloads — have been waiting for this for years:
We need full, native iGPU and GPU support for all major platforms (Intel, AMD, and NVIDIA) in Home Assistant OS, out of the box.
Home Assistant OS is Linux-based, so it should be technically feasible to provide proper GPU passthrough, driver support, and media acceleration without requiring complicated custom builds or workarounds.
If this were enabled by default or easily toggleable:
• Frigate users could dramatically reduce CPU load using Intel Quick Sync or AMD VCN.
• NVIDIA GPU owners could run powerful local AI models (like LLMs or vision models) directly on their system — perfect for offline use, privacy, and performance.
• It would make local voice assistants, image processing, and person detection much more powerful and accessible.
• Even small form factor devices (NUCs, Minisforum, Beelink, etc.) would become true smart home powerhouses.
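To make the request concrete: on a Linux host, the acceleration hardware the points above rely on shows up as device nodes that HAOS would need to expose to add-on containers. A minimal sketch of what such a detection step could look like (the function names are my own, and the `/dev/dri/renderD*` and `/dev/nvidia*` paths are the conventional locations for DRM render nodes and NVIDIA's proprietary devices, not anything HAOS currently provides):

```python
#!/usr/bin/env python3
"""Sketch: detect GPU device nodes a container would need for hardware acceleration.

Assumes a Linux host. /dev/dri/renderD* are the standard DRM render nodes used
by VA-API (Intel Quick Sync, AMD VCN); the NVIDIA proprietary stack exposes
/dev/nvidia* device files instead.
"""
import glob


def find_render_nodes() -> list[str]:
    """Return DRM render nodes present on the host, e.g. /dev/dri/renderD128."""
    return sorted(glob.glob("/dev/dri/renderD*"))


def nvidia_present() -> bool:
    """True if NVIDIA proprietary device nodes (/dev/nvidia0, ...) exist."""
    return bool(glob.glob("/dev/nvidia[0-9]*"))


if __name__ == "__main__":
    nodes = find_render_nodes()
    if nodes:
        print("DRM render nodes:", ", ".join(nodes))
    else:
        print("No DRM render nodes found; VA-API acceleration unavailable.")
    print("NVIDIA device nodes present:", nvidia_present())
```

On a bare-metal HAOS install with a working iGPU, a container started with access to one of these render nodes (plus the matching userspace drivers) is essentially all Frigate or a local inference add-on needs, which is why "out of the box" support feels within reach.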
This isn’t just a feature — it’s a huge opportunity to position Home Assistant as the platform for private, AI-powered smart homes.
And yes: many more people would absolutely be willing to support this by paying for Nabu Casa or sponsoring development. It’s a feature that unlocks entirely new use cases and removes the biggest friction point for technical users.
Please consider pushing this up the roadmap — the community wants it, and the future of local AI and smart home automation depends on it.
Thanks for your great work — we’re excited for what’s next!
Why the uncertainty?
No response
Use cases
See above
What areas might this affect?
No response
Anything else?
No response