Overview

Integrate Valyu’s deep search capabilities directly into your OpenAI applications through our provider system, built on OpenAI’s Responses API. This gives your AI agents access to real-time information from academic papers, news, financial data, and other authoritative sources.

Installation

Install the Valyu and OpenAI packages:
pip install valyu openai
Set your API keys as environment variables:
export VALYU_API_KEY="your-valyu-api-key"
export OPENAI_API_KEY="your-openai-api-key"
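Since both clients read their keys from the environment, it can save debugging time to verify the variables are actually visible to Python before making any calls. A minimal sanity check (the function name is just for illustration):

```python
import os

def check_keys():
    """Raise early if either API key is missing from the environment."""
    missing = [k for k in ("VALYU_API_KEY", "OPENAI_API_KEY")
               if not os.environ.get(k)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
```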

Free Credits

Get your API key with $10 credit from the Valyu Platform.

Basic Usage

The OpenAIProvider class handles the Responses API integration in a few lines:
from openai import OpenAI
from valyu import OpenAIProvider
from dotenv import load_dotenv

load_dotenv()

# Initialize clients
openai_client = OpenAI()
provider = OpenAIProvider()

# Get Valyu tools
tools = provider.get_tools()

# Create a research request
messages = [
    {
        "role": "user",
        "content": "What are the latest developments in quantum computing? Write a summary of your findings."
    }
]

# Step 1: Call OpenAI Responses API with tools
response = openai_client.responses.create(
    model="gpt-4o",
    input=messages,
    tools=tools,
)

# Step 2: Execute tool calls
tool_results = provider.execute_tool_calls(response)

# Step 3: Get final response with search results
if tool_results:
    updated_messages = provider.build_conversation(messages, response, tool_results)
    final_response = openai_client.responses.create(
        model="gpt-4o",
        input=updated_messages,
        tools=tools,
    )
    print(final_response.output_text)
else:
    print(response.output_text)

How It Works

The OpenAIProvider handles the integration details for you:
  1. Tool Registration: Automatically formats Valyu search for OpenAI Responses API
  2. Tool Execution: Manages search API calls behind the scenes
  3. Conversation Flow: Builds proper message sequences with tool results
Important: This uses OpenAI’s Responses API (responses.create()), not Chat Completions!
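For orientation, the tools returned by get_tools() follow the Responses API function-tool shape, where the name and parameters sit at the top level of the tool object (unlike Chat Completions, which nests them under a "function" key). The sketch below is illustrative only; the actual tool name, description, and parameter schema come from the Valyu SDK:

```python
# Illustrative shape of a Responses API function tool. The name
# "valyu_search" and the parameter schema here are hypothetical;
# use provider.get_tools() for the real definitions.
sketch_tool = {
    "type": "function",
    "name": "valyu_search",
    "description": "Search the web, academic papers, and proprietary datasets.",
    "parameters": {
        "type": "object",
        "properties": {
            "query": {"type": "string", "description": "The search query."},
        },
        "required": ["query"],
    },
}
```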

Research Agent Example

Create a simple research agent that can access current information:
from openai import OpenAI
from valyu import OpenAIProvider

def create_research_agent():
    client = OpenAI()
    provider = OpenAIProvider()
    tools = provider.get_tools()
    
    def research(query: str) -> str:
        messages = [
            {
                "role": "system", 
                "content": "You are a research assistant with access to real-time information. Always cite your sources."
            },
            {
                "role": "user",
                "content": query
            }
        ]
        
        # Get response with tools
        response = client.responses.create(
            model="gpt-4o",
            input=messages,
            tools=tools,
        )
        
        # Execute any tool calls
        tool_results = provider.execute_tool_calls(response)
        
        if tool_results:
            # Get final response with search data
            updated_messages = provider.build_conversation(messages, response, tool_results)
            final_response = client.responses.create(
                model="gpt-4o",
                input=updated_messages,
                tools=tools,
            )
            return final_response.output_text
        
        return response.output_text
    
    return research

# Usage
agent = create_research_agent()
result = agent("Find the price of Bitcoin and Nvidia over the last 2 years, then find news about them both respectively, and write a detailed report on the price, news, and potential asset correlation.")
print(result)

Financial Analysis Example

def create_financial_agent():
    client = OpenAI()
    provider = OpenAIProvider()
    tools = provider.get_tools()
    
    def analyze_market(assets: list) -> str:
        query = f"Get the latest news and price data for {', '.join(assets)}, then provide a detailed market analysis report"
        
        messages = [
            {
                "role": "system",
                "content": "You are a financial analyst. Provide data-driven insights with specific numbers and sources."
            },
            {
                "role": "user", 
                "content": query
            }
        ]
        
        response = client.responses.create(
            model="gpt-4o",
            input=messages,
            tools=tools,
        )
        
        tool_results = provider.execute_tool_calls(response)
        
        if tool_results:
            updated_messages = provider.build_conversation(messages, response, tool_results)
            final_response = client.responses.create(
                model="gpt-4o",
                input=updated_messages,
                tools=tools,
            )
            return final_response.output_text
        
        return response.output_text
    
    return analyze_market

# Usage
financial_agent = create_financial_agent()
analysis = financial_agent(["Bitcoin", "Ethereum", "Tesla"])
print(analysis)

Configuration Options

Model Selection

You can use any OpenAI model with the Responses API:
response = client.responses.create(
    model="gpt-4o-mini",  # Faster, cheaper
    # model="gpt-4o",     # More capable  
    # model="o1-preview", # Advanced reasoning
    input=messages,
    tools=tools,
)

Search Parameters

The AI model can automatically use advanced search parameters based on your query context:
  • max_num_results: Limit results (1-20)
  • included_sources: Search specific domains or datasets
  • excluded_sources: Exclude certain sources
  • category: Guide search to specific topics
  • start_date/end_date: Time-bounded searches
  • relevance_threshold: Filter by relevance (0-1)
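To make the parameter list concrete, here is a hypothetical set of arguments the model might emit in a search tool call for a date-bounded academic query. The values are illustrative, not output from a real run:

```python
# Hypothetical tool-call arguments using the parameters listed above.
example_args = {
    "query": "quantum error correction breakthroughs",
    "max_num_results": 5,              # within the 1-20 range
    "included_sources": ["arxiv.org"], # restrict to a specific domain
    "category": "quantum computing",   # guide the search topic
    "start_date": "2024-01-01",
    "end_date": "2024-12-31",
    "relevance_threshold": 0.6,        # within the 0-1 range
}

# The ranges documented above can be checked before dispatching a call.
assert 1 <= example_args["max_num_results"] <= 20
assert 0 <= example_args["relevance_threshold"] <= 1
```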

Best Practices

1. Use Clear System Prompts

messages = [
    {
        "role": "system",
        "content": """You are a research assistant with access to real-time information.
        
        Guidelines:
        - Always cite sources from search results
        - Provide specific data points and numbers
        - If information is recent, mention the date
        """
    },
    {
        "role": "user",
        "content": user_query
    }
]

2. Handle Errors Gracefully

try:
    response = client.responses.create(
        model="gpt-4o",
        input=messages,
        tools=tools,
    )
    
    tool_results = provider.execute_tool_calls(response)
    
    if tool_results:
        updated_messages = provider.build_conversation(messages, response, tool_results)
        final_response = client.responses.create(
            model="gpt-4o",
            input=updated_messages,
            tools=tools,
        )
        return final_response.output_text
    
    return response.output_text
    
except Exception as e:
    # Log the underlying error for debugging; show the user a generic message.
    print(f"Request failed: {e}")
    return "I apologize, but I encountered an error while processing your request."
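Transient failures such as rate limits or timeouts are often worth retrying before giving up. A generic retry wrapper with exponential backoff is one way to do that; the attempt count and delays below are illustrative defaults, not values from the SDK:

```python
import time

def with_retries(fn, attempts=3, base_delay=1.0):
    """Call fn(), retrying on any exception with exponential backoff.

    Re-raises the last exception once all attempts are exhausted.
    """
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** i))
```

For example, wrap the Responses API call as `with_retries(lambda: client.responses.create(model="gpt-4o", input=messages, tools=tools))`.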

3. Multi-Turn Conversations

For chat applications, maintain conversation history:
class ResearchChat:
    def __init__(self):
        self.client = OpenAI()
        self.provider = OpenAIProvider()
        self.tools = self.provider.get_tools()
        self.messages = []
    
    def add_system_message(self, content: str):
        self.messages.append({"role": "system", "content": content})
    
    def chat(self, user_message: str) -> str:
        self.messages.append({"role": "user", "content": user_message})
        
        response = self.client.responses.create(
            model="gpt-4o",
            input=self.messages,
            tools=self.tools,
        )
        
        tool_results = self.provider.execute_tool_calls(response)
        
        if tool_results:
            self.messages = self.provider.build_conversation(
                self.messages, response, tool_results
            )
            final_response = self.client.responses.create(
                model="gpt-4o",
                input=self.messages,
                tools=self.tools,
            )
            assistant_message = final_response.output_text
        else:
            assistant_message = response.output_text
        
        self.messages.append({"role": "assistant", "content": assistant_message})
        return assistant_message

# Usage
chat = ResearchChat()
chat.add_system_message("You are a helpful research assistant.")
response = chat.chat("What's the latest news about renewable energy?")

API Reference

OpenAIProvider

class OpenAIProvider:
    def __init__(self, valyu_api_key: Optional[str] = None):
        """Initialize provider. API key auto-detected from environment if not provided."""
    
    def get_tools(self) -> List[Dict]:
        """Get list of tools formatted for OpenAI Responses API."""
    
    def execute_tool_calls(self, response) -> List[Dict]:
        """Execute tool calls from OpenAI Responses API response."""
    
    def build_conversation(self, input_messages, response, tool_results) -> List[Dict]:
        """Build updated message list with tool results."""

Additional Resources