Mintplex Labs develops AI-driven applications, including AnythingLLM, which enables users to interact with documents through LLMs. The aim was to improve efficiency and reduce distractions during research tasks.
Users lost focus whenever they had to switch between their research documents and AnythingLLM to type out queries; these frequent shifts of attention added up to a significant time sink during research tasks.
We designed and implemented a speech prompt feature that allows users to ask questions verbally while continuing their research without losing focus or breaking eye contact with their documents.
We integrated speech-to-text functionality into AnythingLLM, enabling users to activate the feature with a key combination, speak their queries, and receive answers without switching contexts.
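The mechanics described above (a key combination that starts listening, transcribes speech, and forwards the text as a prompt) can be sketched in a browser context. This is a minimal illustration, not AnythingLLM's actual implementation: it assumes the Web Speech API (`SpeechRecognition` / `webkitSpeechRecognition`), and the Ctrl/Cmd+M hotkey and the `sendPrompt` callback are hypothetical names chosen for the example.

```javascript
// Pure helper: does a keyboard event match the (assumed) speech hotkey?
// Ctrl+M on Windows/Linux, Cmd+M on macOS.
function isSpeechHotkey(event) {
  return (event.ctrlKey || event.metaKey) && event.key.toLowerCase() === "m";
}

// Wire the hotkey to a one-shot speech recognition session and forward
// the final transcript to a prompt-sending callback. Browser-only.
function attachSpeechPrompt(sendPrompt) {
  const Recognition =
    window.SpeechRecognition || window.webkitSpeechRecognition;
  if (!Recognition) return; // browser lacks Web Speech API support

  window.addEventListener("keydown", (event) => {
    if (!isSpeechHotkey(event)) return;
    event.preventDefault();

    const recognition = new Recognition();
    recognition.lang = "en-US";
    recognition.interimResults = false; // only the final transcript

    recognition.onresult = (e) => {
      const transcript = e.results[0][0].transcript;
      sendPrompt(transcript); // hand the spoken query to the chat pipeline
    };
    recognition.start();
  });
}
```

Because the user never touches the keyboard beyond the single key combination, they can keep their eyes on the document while the query is captured and answered.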
The new feature streamlined the query process, reduced time lost to context switching, and kept users focused on their documents. It also demonstrated the potential for further AI-driven productivity enhancements.