
NVIDIA has released a demo version of a chatbot that runs locally on a PC

NVIDIA has presented a free demo version of Chat with RTX, a chatbot that runs locally on your PC. What makes it interesting is that the chatbot gets access to your files and documents: you can feed it personal data and then ask questions and receive answers based on that data.


The company says users can quickly and easily connect local files on their PC as a dataset for an open language model like Mistral or Llama 2.

NVIDIA gives an example of a user asking the chatbot about a restaurant their partner recommended in Las Vegas; the bot scans local files looking for the answer. Many formats are supported, including .txt, .pdf, .doc/.docx and .xml. The company claims the AI will load relevant files into the dataset “in seconds.”
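Under the hood this is a retrieval-augmented generation (RAG) setup: the local files are indexed, the passages most relevant to a question are retrieved, and a local model answers from them. NVIDIA has not published the demo's internals, so the following is only a minimal sketch of the idea, using a simple TF-IDF retriever in place of Chat with RTX's actual pipeline; the folder path and prompt format are illustrative assumptions.

```python
# Minimal RAG sketch: retrieve relevant local .txt files for a question
# and build a grounded prompt for a local LLM (e.g. Mistral or Llama 2).
# This is NOT Chat with RTX's actual implementation, only an illustration.
from pathlib import Path
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def load_documents(folder: str) -> list[str]:
    """Read every .txt file in the folder into a list of strings."""
    return [p.read_text(encoding="utf-8", errors="ignore")
            for p in Path(folder).glob("*.txt")]

def retrieve(question: str, docs: list[str], top_k: int = 3) -> list[str]:
    """Rank documents by TF-IDF cosine similarity to the question."""
    matrix = TfidfVectorizer(stop_words="english").fit_transform(docs + [question])
    scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
    return [docs[i] for i in scores.argsort()[::-1][:top_k]]

def build_prompt(question: str, folder: str) -> str:
    """Assemble a prompt that grounds the local model in retrieved snippets."""
    context = "\n---\n".join(retrieve(question, load_documents(folder)))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# Hypothetical folder of personal notes; the model call itself is omitted here.
print(build_prompt("Which Las Vegas restaurant did my partner recommend?", "./my_notes"))
```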

Chat with RTX is also integrated with YouTube. After you add a video link to the dataset, the chatbot uses the knowledge it acquires from the video for contextual queries. According to NVIDIA, this can help with searching for travel recommendations based on videos from your favorite bloggers or with compiling a short summary of educational content.
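NVIDIA does not document exactly how the YouTube integration works, but the likely flow is to pull the video's transcript and drop it into the same local dataset. Below is a hedged sketch of that idea using the third-party youtube-transcript-api package (not part of Chat with RTX) and its classic get_transcript interface; the video ID and folder are placeholders.

```python
# Sketch: store a YouTube video's transcript as a local .txt document so the
# retrieval code above can search it. Assumes the youtube-transcript-api
# package (pip install youtube-transcript-api), not anything from NVIDIA.
from pathlib import Path
from youtube_transcript_api import YouTubeTranscriptApi

def save_transcript(video_id: str, folder: str) -> None:
    """Fetch the transcript and write it next to the other dataset files."""
    segments = YouTubeTranscriptApi.get_transcript(video_id)  # classic pre-1.0 API
    text = " ".join(seg["text"] for seg in segments)
    Path(folder, f"{video_id}.txt").write_text(text, encoding="utf-8")

save_transcript("VIDEO_ID_HERE", "./my_notes")  # placeholder video ID
```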

The Verge has already tested the bot and came away pleased, writing that it could be useful for data research by journalists and other professionals.


This is a big step towards a real digital assistant that works in the context of personal data. Unlike most other chatbots, the data is not sent to the cloud but processed locally, which makes Chat with RTX both safer and more context-aware.

There are limitations, though. This is a demo version, so expect some bugs. The hardware requirements are also strict: the chatbot only works on a Windows PC with an RTX 30-series or newer graphics card and at least 8 GB of video memory.

 

