LLMSearch: Gemini-Powered Google Maps Exploration Tool with Apify Integration

Unleash the power of AI to explore the world around you with LLMSearch, a cutting-edge tool that seamlessly integrates Google's Gemini model and the Apify platform to revolutionize your Google Maps experience. Developed by Muhammad Dava Pasha, LLMSearch allows you to effortlessly discover places to go, eat, and more, all powered by the intelligence of Gemini and the comprehensive data of Google Maps.

Key Features and Benefits

  • Gemini-Powered Search: Leverage the advanced natural language processing capabilities of Gemini to refine your Google Maps searches. Ask complex questions and receive intelligent recommendations tailored to your specific needs.
  • Apify Integration: LLMSearch utilizes Apify to extract and structure data from Google Maps, providing you with comprehensive and organized search results.
  • Effortless Exploration: Discover new restaurants, attractions, and points of interest with ease. LLMSearch helps you find hidden gems and make informed decisions about where to go.
  • Streamlit Interface: Interact with LLMSearch through a user-friendly Streamlit application, making it accessible to everyone.

Example Use Cases

Imagine you're planning a trip and want to find the best Italian restaurants in Rome with outdoor seating and high ratings. Simply ask LLMSearch, and it will use Gemini to understand your request, then leverage Apify to extract relevant data from Google Maps and present you with a curated list of options.

Here's a glimpse of what LLMSearch can do (see the example image in the repository):

  • Finding top-rated coffee shops with Wi-Fi near you.
  • Discovering hidden hiking trails with stunning views.
  • Locating pet-friendly accommodations in a new city.

Getting Started

Ready to experience the future of Google Maps exploration? Here's how to get started with LLMSearch:

1. Installation

First, create a virtual environment using Conda to manage dependencies:

conda create -n llmsearch python=3.10
conda activate llmsearch

Then create a requirements.txt file with the following contents:

streamlit==1.45.0
apify_client==1.10.0
google-genai==1.14.0
python-dotenv==1.1.0

After that, install the dependencies:

pip install -r requirements.txt

2. Environment Setup

Next, configure your environment variables by creating a .env file in the root directory of the project. You'll need API keys from Google AI Studio and Apify:

GEMINI_API_KEY=YOUR_API_KEY
APIFY_API_KEY=YOUR_API_KEY
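
Under the hood, app.py has to turn these keys into API clients. Here is a minimal sketch of how that wiring might look, assuming python-dotenv loads the .env file; the names apify_client and client match the snippets shown later in this post, but the exact initialization code in the repository may differ:

import os

from dotenv import load_dotenv
from apify_client import ApifyClient
from google import genai

# Read GEMINI_API_KEY and APIFY_API_KEY from the .env file into the environment
load_dotenv()

# Apify client used by the Google Maps search tool
apify_client = ApifyClient(os.getenv("APIFY_API_KEY"))

# google-genai client used to talk to the Gemini model
client = genai.Client(api_key=os.getenv("GEMINI_API_KEY"))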

3. Running the Application

Finally, launch the LLMSearch application using Streamlit:

streamlit run app.py

This will open the LLMSearch interface in your web browser, where you can start exploring Google Maps with the power of Gemini!

Diving Deeper into the Code

Let's break down some key parts of the code:

Google Maps Search Tool

The GoogleMapSearch function is the heart of the integration with Google Maps. It takes searchString, a list of search terms (e.g., "best pizza", "dog friendly parks"), and a location (e.g., "New York, USA") as input. It then uses the Apify Google Maps Scraper actor to extract relevant information, including:

  • place_name: The name of the place.
  • total_score: The average rating of the place.
  • reviewsCount: The number of reviews.
  • url: A link to the place on Google Maps.

The function returns a list of dictionaries, one per Google Maps result. Its docstring instructs Gemini to format the results in markdown and to always include the URL, which is crucial for letting users jump straight to the place on Google Maps.

def GoogleMapSearch(searchString: list[str], location: str):
    """Returns google map search result in object below
    place_name = The place name
    total_score = The total score of that place
    reviewsCount = How many given reviews
    url = The url where user can click

    Please format them in markdown for easy to read and always include the url in response!

    Args:
      searchString: Type what you’d normally search for in the Google Maps search bar, like English breakfast or pet shelter. Aim for unique terms for faster processing. Using similar terms (e.g., bar vs. restaurant vs. cafe) may slightly increase your capture rate but is less efficient.
      location: Define location using free text. Simpler formats work best; e.g., use City + Country rather than City + Country + State.
    """
    run_input = {
        "searchStringsArray": searchString,
        "locationQuery": location,
        "maxCrawledPlacesPerSearch": 10,
        # ... other Apify actor input parameters ...
    }

    # Run the Apify Google Maps Scraper actor and wait for it to complete
    run = apify_client.actor("nwua9Gu5YrADL7ZDj").call(run_input=run_input)

    result = []
    # Collect the fields we care about from the run's default dataset
    for item in apify_client.dataset(run["defaultDatasetId"]).iterate_items():
        result.append({
            "place_name": item["title"],
            "total_score": item["totalScore"],
            "reviewsCount": item["reviewsCount"],
            "url": item["url"]
        })

    return result
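
If you want to sanity-check the Apify integration on its own, you could call the function directly from a Python shell. The search terms below are purely illustrative:

# Hypothetical standalone call to test the scraper integration
places = GoogleMapSearch(["best pizza"], "New York, USA")
for place in places:
    print(f"{place['place_name']}: {place['total_score']} stars, {place['reviewsCount']} reviews")
    print(place["url"])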

Gemini Integration

The generateChat function connects to the Gemini model and uses it to generate responses based on user prompts. Crucially, it defines the GoogleMapSearch function as a tool available to the Gemini model. This allows Gemini to dynamically call the GoogleMapSearch function when it needs to retrieve information from Google Maps.

def generateChat():
    # `client` is the google-genai client and `prompt` is the user's latest
    # message, both defined at module level in app.py
    response = client.models.generate_content_stream(
        model="gemini-2.0-flash",
        contents=prompt,
        # Passing the Python function as a tool lets Gemini call GoogleMapSearch
        config=types.GenerateContentConfig(tools=[GoogleMapSearch])
    )

    for chunk in response:
        yield chunk.text
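
To see the tool wiring in isolation, here is a hypothetical one-shot, non-streaming call (not part of the repository) that assumes the client from the environment setup above. With the SDK's automatic function calling, Gemini invokes GoogleMapSearch whenever it needs map data and the results are fed back before the final answer is produced:

from google.genai import types

# Hypothetical single-turn request to verify that Gemini can call the tool
response = client.models.generate_content(
    model="gemini-2.0-flash",
    contents="Find highly rated ramen shops in Tokyo, Japan",
    config=types.GenerateContentConfig(tools=[GoogleMapSearch]),
)
print(response.text)  # the SDK handles the GoogleMapSearch call automatically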

Streamlit Application

The Streamlit application provides a user-friendly interface for interacting with LLMSearch. It allows users to enter prompts, displays the chat history, and streams the responses generated by Gemini.

st.title("LLM Search made By Muhammad Dava Pasha")

# Initialize chat history
if "messages" not in st.session_state:
    st.session_state.messages = []

for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.markdown(message["content"])

if prompt := st.chat_input("How can I help you today?"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    
    with st.chat_message("user"):
        st.markdown(prompt)

    with st.chat_message("assistant"):
        # st.write_stream calls the generator function and renders chunks as they arrive
        response = st.write_stream(generateChat)

    st.session_state.messages.append({"role": "assistant", "content": response})

Contributing

LLMSearch is an open-source project, and contributions are welcome! Feel free to fork the repository, submit pull requests, or open issues to report bugs or suggest new features.

Conclusion

LLMSearch represents a significant step forward in how we interact with and explore Google Maps. By combining the power of Gemini and Apify, it offers a seamless and intelligent way to discover the world around us. Give it a try and unlock a new level of exploration!

You can find the full source code at https://github.com/ImCodingCat/LLMSearch
