Issue
I'm trying to make my search method work with a list of search keywords.
Is there a way to make await asyncio.gather work over a list of searches?
async def _request(query: dict):
    async with httpx.AsyncClient() as client:
        r = await client.post(
            'https://nmmgzjq6qi-2.algolianet.com/1/indexes/public_prod_inventory_track_index/query?x-algolia-agent=Algolia%20for%20JavaScript%20(4.12.0)%3B%20Browser',
            headers=headers,
            json=query,
        )
        return r.json()
async def to_search(query: str, tags: list[str] = [], page=0, hitsPerPage=100):
    data = {
        "query": query,
        "page": page,
        "hitsPerPage": hitsPerPage,
        "facets": ["*"],
        "analytics": True,
        "clickAnalytics": True,
        "tagFilters": [],
        "facetFilters": [make_tags_filter(tags)],
        "maxValuesPerFacet": hitsPerPage,
        "enableABTest": False,
        "userToken": userToken,
        "filters": "",
        "ruleContexts": []
    }
    return await _request(data)
import asyncio

search = ['coffee', 'banana', 'apple']
#search = input()

for x in search:
    r = await asyncio.gather(*[to_search(x, page=i) for i in range(10)])
Also, is there a way to make search = input() accept a list of keywords (assuming they are separated by commas)?
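(A minimal sketch of that second idea, assuming the keywords are separated by commas; the literal string here stands in for the value returned by input():)

```python
# Split a comma-separated string into keywords,
# trimming whitespace and dropping empty entries.
raw = "coffee, banana , apple"  # stands in for input()
search = [kw.strip() for kw in raw.split(",") if kw.strip()]
print(search)  # ['coffee', 'banana', 'apple']
```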
Solution
You can make several requests with any parameters as shown in the code snippet below. Note that this is the simplest approach; if you have many tasks, implement the producer-consumer pattern using asyncio.Queue.
import asyncio
from typing import List

import httpx


async def to_search(url, client: httpx.AsyncClient):
    res = await client.get(url)
    await asyncio.sleep(3)
    return res.status_code


async def main_wrapper(urls: List[str]):
    # you need only one AsyncClient for an asyncio app
    async with httpx.AsyncClient() as client:
        results = await asyncio.gather(*[to_search(i, client) for i in urls])
        print(results)


if __name__ == '__main__':
    urls = ["http://google.com"] * 20
    asyncio.run(main_wrapper(urls=urls))
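For reference, the producer-consumer pattern mentioned above can be sketched as follows. This is a generic, hypothetical example, not tied to the Algolia request: a fixed pool of workers drains an asyncio.Queue, so only a bounded number of tasks run at once no matter how many items are queued. The `item * 2` line is a placeholder for the real request.

```python
import asyncio

async def worker(queue: asyncio.Queue, results: list):
    # Each worker pulls items until it is cancelled.
    while True:
        item = await queue.get()
        try:
            results.append(item * 2)  # placeholder for the real async request
        finally:
            queue.task_done()

async def main(items, n_workers=3):
    queue = asyncio.Queue()
    for item in items:
        queue.put_nowait(item)
    results = []
    workers = [asyncio.create_task(worker(queue, results))
               for _ in range(n_workers)]
    await queue.join()  # wait until every queued item is processed
    for w in workers:
        w.cancel()      # workers loop forever, so stop them explicitly
    return results

print(asyncio.run(main([1, 2, 3, 4])))
```

With a pool size of 3 and, say, thousands of queued items, at most three requests would be in flight at a time, which is the main advantage over a bare asyncio.gather over everything.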
Answered By - Artiom Kozyrev