AI tools are the tentacles exposed to AI models to allow them to access and drive external functionality. The primary form this takes for LLMs is function calling. We have also seen so-called "computer use" and web browsing exposed in similar fashion. For code generation, the ability to integrate dependencies in the form of external packages can also be considered a form of tooling. In other words, anything that extends the capabilities of the AI/LLM model can be thought of as a tool. We can then extend this notion into the runtime environment of LLM-generated code by having the model generate code that relies on runtime dependency injection.

Late Binding

Dependency injection is accomplished via "late binding", meaning the actual implementation of a given piece of functionality isn't specified at the time the code is written, but is bound at some later point before the logic executes. Late binding can be implemented at the language level, such as C++'s virtual functions, which are resolved at runtime via dynamic dispatch.
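A minimal sketch of this pattern in C++ (the SearchTool interface and the implementations below are hypothetical illustrations, not part of any platform API): the generated logic depends only on an abstract interface, and the concrete tool is injected late, just before execution.

```cpp
#include <iostream>
#include <memory>
#include <string>

// Hypothetical tool interface: generated code programs against this
// abstraction rather than any concrete implementation.
class SearchTool {
public:
    virtual ~SearchTool() = default;
    virtual std::string search(const std::string& query) = 0;
};

// One possible implementation, supplied by the runtime environment.
class WebSearch : public SearchTool {
public:
    std::string search(const std::string& query) override {
        return "web results for: " + query;
    }
};

// A stub implementation, e.g. for sandboxed testing of generated code.
class MockSearch : public SearchTool {
public:
    std::string search(const std::string& query) override {
        return "mock results for: " + query;
    }
};

// Generated logic: written without knowing which tool it will receive.
void runAgentStep(SearchTool& tool) {
    std::cout << tool.search("late binding") << "\n";
}

int main(int argc, char**) {
    // The binding decision happens here, at runtime, not when
    // runAgentStep was authored.
    std::unique_ptr<SearchTool> tool;
    if (argc > 1) {
        tool = std::make_unique<WebSearch>();
    } else {
        tool = std::make_unique<MockSearch>();
    }
    runAgentStep(*tool);
}
```

Because the virtual call in runAgentStep is resolved through dynamic dispatch, the runtime environment can swap implementations without touching the generated code.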
We have released updates to the Solvent-Botworx platform that include the introduction of automation programs and the addition of multi-modal capabilities. Multi-modal speech capabilities in particular are an intriguing development for integration into "deep-work" workspaces. One can imagine a future in which anyone doing heavy-duty cognitive work operates within a software workspace with seamless multi-modal AI integrations. With the right integrations, multi-modal capabilities offer the possibility of a true AI-aided assistant. Demo videos are below.

Speech modality demo:

Image and speech modality demo: