• NVIDIA released a demo version of a chatbot that runs locally on your PC, giving it access to your files and documents.
• The chatbot, called Chat with RTX, can answer queries and create summaries based on personal data fed into it.
• It supports various file formats and can integrate YouTube videos for contextual queries, making it useful for data research and analysis.
I recommend jan.ai over this; last I heard it was a decent option.
I use https://huggingface.co/chat. You can also easily host open-source models on your local machine.
i have no need to talk to my gpu, i have a shrink for that
Idk I kinda like the idea of a madman living in my graphics card. I want to be able to spin them up and have them tell me lies that sound plausible and hallucinate things.
That was an annoying read. It doesn’t say what this actually is.
It’s not a new LLM. Chat with RTX is specifically software for doing inference (i.e., running LLMs) at home, using the hardware acceleration of RTX cards. Several other projects do this, though they may not be as optimized for NVIDIA’s hardware.
Go directly to NVIDIA to avoid the clickbait.
Chat with RTX uses retrieval-augmented generation (RAG), NVIDIA TensorRT-LLM software and NVIDIA RTX acceleration to bring generative AI capabilities to local, GeForce-powered Windows PCs. Users can quickly, easily connect local files on a PC as a dataset to an open-source large language model like Mistral or Llama 2, enabling queries for quick, contextually relevant answers.
Source: https://blogs.nvidia.com/blog/chat-with-rtx-available-now/
Download page: https://www.nvidia.com/en-us/ai-on-rtx/chat-with-rtx-generative-ai/
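For anyone curious what the RAG part in NVIDIA’s description actually means, here’s a toy sketch of the idea in plain Python. This is not NVIDIA’s implementation: real systems (including Chat with RTX) use vector embeddings for retrieval and an actual LLM for generation; here word overlap stands in for embeddings and the model call is just a prompt you’d hand off.

```python
# Toy sketch of retrieval-augmented generation (RAG): split local
# documents into chunks, retrieve the chunks most relevant to a query,
# and stuff them into the prompt so the LLM answers from your data.
# Word-overlap scoring is a stand-in for real embedding similarity.

def chunk(text, size=40):
    """Split a document into fixed-size word chunks."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def retrieve(query, chunks, k=2):
    """Rank chunks by how many query words they share (naive retrieval)."""
    q = set(query.lower().split())
    scored = sorted(chunks,
                    key=lambda c: len(q & set(c.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query, context_chunks):
    """Assemble the retrieved context and the question into one prompt."""
    context = "\n---\n".join(context_chunks)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Stand-ins for "local files on a PC" from the NVIDIA blurb.
docs = [
    "Chat with RTX runs inference locally on GeForce RTX GPUs using TensorRT-LLM.",
    "The grocery list says we need eggs, milk, and coffee for the weekend.",
]
chunks = [c for d in docs for c in chunk(d)]
top = retrieve("Chat with RTX hardware GPU", chunks, k=1)
prompt = build_prompt("What hardware does Chat with RTX use?", top)
print(prompt.splitlines()[0])  # Answer using only this context:
```

The point is that the LLM itself never indexes your files; the retrieval step picks the relevant snippets at query time, which is why this can work entirely offline against a local model like Mistral or Llama 2.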
They say it works without an internet connection, and if that’s true this could be pretty awesome. I’m always skeptical about interacting with chatbots that run in the cloud, but if I can put this behind a firewall so I know there’s no telemetry, I’m on board.
it gives the chatbot access to your files and documents
I’m sure nvidia will be trustworthy and responsible with this