Describe the bug
When running against a remote Ollama server (as opposed to a local one), smart_scraper_graph.run() hangs and never returns (it was also left running overnight, just in case).
Resource usage on the remote machine goes up, so the requests are reaching it.
The same model is used on both the remote and the local machine.
Tested with all v1.4x versions.
Tested with different temperatures.
Checked with and without the base_url parameter when running locally (both work fine).
Tested with and without headless.
Ollama v0.6.1 on both machines.
To Reproduce
from scrapegraphai.graphs import SmartScraperGraph

graph_config = {
    "llm": {
        "model": "ollama/llama3.2",
        "max_tokens": 8192,
        "base_url": "http://192.168.1.1:11434",
        "format": "json"
    },
    "verbose": True,
    "headless": False,
}

smart_scraper_graph = SmartScraperGraph(
    prompt="I want a list of all the links of the issues on the page",
    source="https://github.com/ScrapeGraphAI/Scrapegraph-ai/issues",
    config=graph_config
)

result = smart_scraper_graph.run()

import json
print(json.dumps(result, indent=4))
Expected behavior
A clear and concise description of what you expected to happen.
Hey @axibo-reiner! I'm here to help you with any bugs, questions, or contributions you have. Let's tackle this issue together!
I found a similar discussion where a user experienced a 502 Bad Gateway error when using the Ollama API with Scrapegraph-ai, despite the Ollama service running locally and being tested successfully with curl. This issue remains unsolved [1].
For your specific issue with the smart_scraper_graph.run() method hanging when using a remote Ollama server, consider the following:
Timeout Settings: Ensure that the timeout settings are configured appropriately. Scrapegraph-ai has a "graph timeout" and a "timeout" in loader_kwargs that could affect long-running requests [2][3] (see the configuration sketch after this list).
Network Configuration: Double-check any network configurations or firewall settings that might be affecting the communication between your local machine and the remote Ollama server. Although specific network configurations weren't found in the documentation, ensuring open communication ports and no firewall restrictions could help.
Logging and Debugging: Utilize the logging functions available in Scrapegraph-ai to diagnose the issue. You can set the verbosity to debug level using set_verbosity_debug() to get more detailed logs, which might provide insights into where the process is hanging [4][5].
These steps might help in identifying the cause of the hang and resolving the issue.
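Here is a minimal sketch combining the timeout and logging suggestions above. It assumes a graph-level "timeout" key, a "timeout" inside "loader_kwargs", and a set_verbosity_debug() helper, as described in the cited references [2][3][4][5]; the exact key names and import path may differ between Scrapegraph-ai versions, so treat this as illustrative rather than a definitive configuration.

# Hypothetical sketch: raise the timeouts and turn on debug logging to see
# where the remote-Ollama run gets stuck. The "timeout" and "loader_kwargs"
# keys are assumed from the references above and may vary by version.
from scrapegraphai.graphs import SmartScraperGraph
from scrapegraphai.utils import set_verbosity_debug  # debug-level logging helper [4][5]

set_verbosity_debug()  # emit detailed logs from each graph node

graph_config = {
    "llm": {
        "model": "ollama/llama3.2",
        "max_tokens": 8192,
        "base_url": "http://192.168.1.1:11434",
        "format": "json",
    },
    "verbose": True,
    "headless": False,
    "timeout": 600,                      # assumed graph-level timeout, in seconds [2]
    "loader_kwargs": {"timeout": 600},   # assumed timeout passed to the page loader [3]
}

smart_scraper_graph = SmartScraperGraph(
    prompt="I want a list of all the links of the issues on the page",
    source="https://github.com/ScrapeGraphAI/Scrapegraph-ai/issues",
    config=graph_config,
)
result = smart_scraper_graph.run()

With debug logging enabled, the last node logged before the hang should indicate whether the graph is stuck fetching the page or waiting on the remote Ollama response.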