
Function calling

This demo shows how you can use the intelligent prompt gateway as a copilot to explore employee data by calling the correct API functions. The gateway calls the appropriate function and also engages with the user to extract any required parameters. This demo assumes you are running ollama natively.
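As a rough illustration of the function-calling pattern described above, the gateway's job is to map a natural-language query to a tool schema and route the model's tool call to the right function. The function name, schema, and data below are hypothetical stand-ins, not the demo's actual API:

```python
# Hypothetical sketch of the function-calling pattern: a tool schema the
# model is shown, and a dispatcher that routes the model's tool call to a
# local function. Names and data are illustrative, not from the demo.
import json

# In-memory stand-in for the employee API backing the copilot.
EMPLOYEES = [
    {"name": "Ana", "department": "Sales", "salary": 90000},
    {"name": "Bo", "department": "Sales", "salary": 85000},
    {"name": "Cy", "department": "Eng", "salary": 120000},
]

# JSON schema the gateway would hand to the model so it can pick this
# function and fill in its parameters.
TOOL_SCHEMA = {
    "name": "top_employees_by_salary",
    "parameters": {
        "type": "object",
        "properties": {
            "department": {"type": "string"},
            "limit": {"type": "integer"},
        },
        "required": ["department", "limit"],
    },
}

def top_employees_by_salary(department: str, limit: int) -> list:
    """Return the top-N earners in a department, highest salary first."""
    rows = [e for e in EMPLOYEES if e["department"] == department]
    rows.sort(key=lambda e: e["salary"], reverse=True)
    return [e["name"] for e in rows[:limit]]

def dispatch(tool_call: dict) -> list:
    # The model returns a function name plus JSON-encoded arguments;
    # the gateway validates and invokes the matching local function.
    assert tool_call["name"] == TOOL_SCHEMA["name"]
    args = json.loads(tool_call["arguments"])
    return top_employees_by_salary(**args)
```

For example, a query like "top 2 in Sales" would arrive as a tool call such as `{"name": "top_employees_by_salary", "arguments": '{"department": "Sales", "limit": 2}'}`; if the model cannot fill a required parameter, it asks the user a follow-up question instead.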

Starting the demo

  1. Create a .env file and set your OpenAI key via the env var OPENAI_API_KEY
  2. Start services
    docker compose up
    
  3. Download the Bolt-FC model. This demo assumes Bolt-Function-Calling-1B:Q4_K_M has been downloaded to the local folder.
  4. If running ollama natively, run
    ollama serve
    
  5. Create the model in the ollama repository from the model file
    ollama create Bolt-Function-Calling-1B:Q4_K_M -f Bolt-FC-1B-Q4_K_M.model_file
    
  6. Navigate to http://localhost:18080/
  7. You can type in queries like "show me the top 5 employees in each department with highest salary"
    • You can also ask follow up questions like "just show the top 2"
  8. To see metrics, navigate to "http://localhost:3000/" (log in with admin/grafana)
    • Open the dashboard named "Intelligent Gateway Overview"
    • On this dashboard you can see request latency and the number of requests
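The queries in steps 6–7 can also be sent to the gateway programmatically. Assuming it exposes an OpenAI-compatible chat endpoint (an assumption — check the demo's UI and config for the actual route), the request body can be sketched as:

```python
# Sketch of building a chat request for the gateway. The URL path is an
# assumption (OpenAI-compatible endpoint); the model name comes from the
# setup steps above.
import json

GATEWAY_URL = "http://localhost:18080/v1/chat/completions"  # assumed route

def build_chat_request(history: list, query: str) -> dict:
    """Build an OpenAI-style chat payload. Follow-up questions like
    'just show the top 2' work because prior turns ride along in
    the messages list."""
    messages = history + [{"role": "user", "content": query}]
    return {"model": "Bolt-Function-Calling-1B:Q4_K_M", "messages": messages}

payload = build_chat_request(
    [], "show me the top 5 employees in each department with highest salary"
)
body = json.dumps(payload)
# e.g. POST with:
# requests.post(GATEWAY_URL, data=body,
#               headers={"Content-Type": "application/json"})
```

Keeping the full message history in each request is what lets the copilot resolve a follow-up like "just show the top 2" against the earlier query.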