Everything You Need to Know to Empower Your Secure AI Journey
When a user manually uploads a document, transcript, or website, only that user has permission to view and ask questions about that content.
When an email is forwarded to the system, all users who received the original email will have access to it.
When integrating systems such as SharePoint, Teams chat, and Google Drive, Pragatix syncs permissions from the source system, so only users with access to the content there can ask questions about it or get answers from it.
If permissions are changed in the source system, Pragatix syncs them back to the dashboard.
Each user can see only the items they have access to and, therefore, can ask questions only about that content.
When a user asks a question across all company data, the system finds the most relevant documents that the user can access, and only those documents are sent to the AI to generate the answer.
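The permission-filtered retrieval described above can be sketched as follows. This is a minimal illustration under stated assumptions, not Pragatix's actual implementation: the `Document` class, `retrieve_for_user` function, and the word-overlap relevance score are all hypothetical (a real system would use vector similarity and permissions synced from the source system).

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    doc_id: str
    text: str
    allowed_users: set = field(default_factory=set)  # synced from the source system

def retrieve_for_user(index, user_id, query, top_k=3):
    """Return only documents the user may access, ranked by a toy relevance score."""
    # Filter FIRST by permission: inaccessible documents are never candidates.
    accessible = [d for d in index if user_id in d.allowed_users]
    # Toy relevance: count query-term overlap (stand-in for vector similarity).
    terms = set(query.lower().split())
    ranked = sorted(
        accessible,
        key=lambda d: len(terms & set(d.text.lower().split())),
        reverse=True,
    )
    return ranked[:top_k]

index = [
    Document("hr-1", "vacation policy for employees", {"alice"}),
    Document("fin-1", "quarterly revenue report", {"bob"}),
    Document("pub-1", "company vacation calendar", {"alice", "bob"}),
]

# Alice never sees Bob's finance document, even with a matching query.
results = retrieve_for_user(index, "alice", "vacation policy")
print([d.doc_id for d in results])  # → ['hr-1', 'pub-1']
```

Because filtering happens before ranking, content a user cannot access is excluded before anything is sent to the model.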
You can deploy it in your own AWS tenant or on-prem.
In a database on your own servers. It is never sent to external servers.
We run a local LLM that is fully standalone and does not require internet connectivity. We currently use Meta’s Llama 3 model, which is open source and free for products with fewer than 700 million monthly active users.
It requires one Windows server and two Linux servers, as well as at least one GPU with a minimum of 24 GB of VRAM. More details here.
No, we store customer data in segregated vector databases and use retrieval-augmented generation (RAG) to produce answers.
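The RAG flow can be sketched in a few lines: relevant snippets are retrieved from the tenant's segregated store, placed into a prompt, and passed to the local model; the model's weights are never updated with customer data. This is a simplified sketch, and the `build_prompt` and `answer` functions and the `tenant_store`/`local_llm` objects are illustrative assumptions, not Pragatix's actual API.

```python
def build_prompt(question, retrieved_docs):
    """Assemble a prompt containing only the retrieved, permission-checked context."""
    context = "\n".join(f"- {d}" for d in retrieved_docs)
    return (
        "Answer the question using ONLY the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {question}\n"
    )

def answer(question, tenant_store, local_llm):
    # Each customer has its own segregated store; retrieval never crosses tenants.
    docs = tenant_store.search(question)
    prompt = build_prompt(question, docs)
    # Inference only: the model generates from the prompt and is never trained here.
    return local_llm(prompt)

prompt = build_prompt(
    "What is the vacation policy?",
    ["Employees receive 20 vacation days per year."],
)
print(prompt)
```

The key property of this design is that customer data reaches the model only transiently, inside a prompt at inference time, rather than being baked into the model through training.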