The concept of developing an overarching AI that “lower AIs” could rely on when necessary (mentioned in my AutoGen post) was adopted by AutoGen and led to the creation of EcoAssistant.

EcoAssistant, detailed on Microsoft’s AutoGen blog, is a system designed to make user query resolution with large language models (LLMs) both more accurate and more affordable.

Key features of EcoAssistant include:

  1. External API Integration: EcoAssistant leverages external APIs to extend LLMs beyond their pre-trained knowledge, allowing them to answer queries that depend on live external data (e.g., places, weather, or stock information).
  2. Two-Agent System: The system pairs an LLM assistant agent, which proposes and refines code, with a code executor agent, which runs that code and relays the output back to the assistant (see the first sketch after this list).
  3. Security Measures: The initial message contains only fake API keys; the code executor substitutes the real keys immediately before execution, so the real keys are never exposed to the LLM (see the key-substitution sketch after this list).
  4. Solution Demonstration: Successful query-code pairs are stored in a database. When a new query arrives, the system retrieves the most similar past query along with its code and supplies it as a demonstration, improving resolution efficiency and overall performance (see the retrieval sketch after this list).
  5. Assistant Hierarchy: To minimize costs, EcoAssistant uses a hierarchy of LLMs, starting each conversation with the most cost-effective model and escalating to more expensive models only when the cheaper one fails, reducing reliance on higher-cost LLMs (see the escalation sketch after this list).
  6. Synergistic Effect: Assistant Hierarchy and Solution Demonstration reinforce each other: solutions produced by the more powerful LLMs are stored and later retrieved as demonstrations that guide the less capable ones, improving both performance and cost-effectiveness.
  7. Performance Evaluation: EcoAssistant outperformed a single GPT-4 assistant in terms of success rate and cost-effectiveness when tested on datasets related to places, weather, and stock queries.
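
To make point 2 concrete, here is a minimal sketch of the assistant/executor pair, assuming the pyautogen v0.2-style `AssistantAgent`/`UserProxyAgent` API. The model name, working directory, and example query are placeholder choices of mine, not EcoAssistant’s actual configuration.

```python
# A minimal sketch of the assistant/executor pair, assuming the pyautogen v0.2-style API.
# The model name, work_dir, and example query are placeholders, not EcoAssistant's setup.
import os

from autogen import AssistantAgent, UserProxyAgent

llm_config = {"config_list": [{"model": "gpt-3.5-turbo", "api_key": os.environ["OPENAI_API_KEY"]}]}

# The assistant agent proposes and refines code to answer the query.
assistant = AssistantAgent(name="assistant", llm_config=llm_config)

# The code executor agent runs the proposed code and relays the output back.
executor = UserProxyAgent(
    name="code_executor",
    human_input_mode="NEVER",
    code_execution_config={"work_dir": "coding", "use_docker": False},
    is_termination_msg=lambda msg: (msg.get("content") or "").rstrip().endswith("TERMINATE"),
)

# The executor starts the conversation; the two agents then iterate until the
# assistant signals that the task is done.
executor.initiate_chat(
    assistant,
    message="What is the current temperature in Berlin? Use a weather API.",
)
```

The executor feeds the output of each code block back to the assistant, which keeps refining its code until it signals completion.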
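
The security measure from point 3 can be pictured as a simple substitution step on the executor side. This is a hypothetical illustration rather than EcoAssistant’s actual code; the placeholder key names and environment variables are my own.

```python
# Hypothetical sketch of the fake-key substitution (not EcoAssistant's actual code):
# the LLM only ever sees placeholder keys; the executor swaps in the real values
# from the environment right before running the generated code.
import os

# Placeholder keys shown to the LLM -> environment variables holding the real keys.
FAKE_TO_REAL = {
    "FAKE_WEATHER_API_KEY": "WEATHER_API_KEY",
    "FAKE_STOCK_API_KEY": "STOCK_API_KEY",
}

def inject_real_keys(generated_code: str) -> str:
    """Replace fake API keys in LLM-generated code with the real ones before execution."""
    for fake, env_var in FAKE_TO_REAL.items():
        generated_code = generated_code.replace(fake, os.environ.get(env_var, ""))
    return generated_code
```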
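
For point 4, the core mechanism is a store of solved query-code pairs plus a similarity search over past queries. The sketch below is likewise hypothetical: `embed` is a toy bag-of-words stand-in for whatever sentence encoder a real system would use.

```python
# Hypothetical sketch of Solution Demonstration: keep solved (query, code) pairs and,
# for a new query, retrieve the most similar past query and its code as a demonstration.
from __future__ import annotations

import math
from dataclasses import dataclass, field

def embed(text: str) -> dict[str, float]:
    """Toy term-frequency vector; a real system would use a sentence encoder."""
    vec: dict[str, float] = {}
    for token in text.lower().split():
        vec[token] = vec.get(token, 0.0) + 1.0
    return vec

def cosine(a: dict[str, float], b: dict[str, float]) -> float:
    dot = sum(weight * b.get(token, 0.0) for token, weight in a.items())
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

@dataclass
class SolutionStore:
    pairs: list[tuple[str, str]] = field(default_factory=list)  # (query, code)

    def add(self, query: str, code: str) -> None:
        """Store a successfully resolved query together with its working code."""
        self.pairs.append((query, code))

    def most_similar(self, query: str) -> tuple[str, str] | None:
        """Return the past (query, code) pair whose query is closest to the new one."""
        if not self.pairs:
            return None
        q = embed(query)
        return max(self.pairs, key=lambda pair: cosine(q, embed(pair[0])))
```

The retrieved pair is then supplied to the assistant as a demonstration alongside the new query.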
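
Finally, the Assistant Hierarchy from point 5 amounts to trying models in cost order and stopping at the first success. The model list and the `attempt` callback below are placeholders; in EcoAssistant each attempt is a full assistant/executor conversation.

```python
# Hypothetical sketch of the Assistant Hierarchy: try models in cost order and stop at
# the first success. The model list and the attempt callback are placeholders.
from typing import Callable, Optional

MODEL_HIERARCHY = ["gpt-3.5-turbo", "gpt-4"]  # ordered from cheapest to most expensive

def solve_with_hierarchy(
    query: str,
    attempt: Callable[[str, str], tuple[bool, str]],
) -> Optional[str]:
    """attempt(query, model) returns (success, answer); escalate only on failure."""
    for model in MODEL_HIERARCHY:
        success, answer = attempt(query, model)
        if success:
            return answer  # a cheaper model succeeded, so no need to escalate
    return None  # even the most capable model failed

# Example usage with a dummy attempt that only "succeeds" on the expensive model:
if __name__ == "__main__":
    def dummy_attempt(query: str, model: str) -> tuple[bool, str]:
        return (model == "gpt-4", f"answer from {model}")

    print(solve_with_hierarchy("current price of MSFT stock?", dummy_attempt))
```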

Overall, EcoAssistant represents a significant advancement in utilizing LLMs for query resolution, offering a more accurate, cost-effective, and secure solution.