There are two ways to do so. One is to create a utility function that checks the model name and calls the corresponding inference API:

    openai_models = ["gpt-4", "gpt-3"]
    open_source_models = ["llama", "alpaca"]
    google_models = ["bard"]

    def create(model, **kwargs):
        if model in openai_models:
            # generate...
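To make the dispatch pattern above concrete, here is a minimal runnable sketch. The per-provider branches are placeholders (hypothetical helpers, not real SDK calls), since the original snippet elides the actual generation code:

```python
# Dispatch to a provider based on the model name.
# The returned strings stand in for real API calls, which would
# go through each provider's own SDK in practice.
openai_models = ["gpt-4", "gpt-3"]
open_source_models = ["llama", "alpaca"]
google_models = ["bard"]

def create(model, **kwargs):
    if model in openai_models:
        # placeholder for an OpenAI API call
        return f"openai:{model}"
    elif model in open_source_models:
        # placeholder for a local/open-source inference call
        return f"local:{model}"
    elif model in google_models:
        # placeholder for a Google API call
        return f"google:{model}"
    raise ValueError(f"unknown model: {model}")
```

The benefit of this approach is that callers use one `create(model, **kwargs)` entry point regardless of which backend actually serves the request.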