This repository has been archived by the owner on Apr 24, 2024. It is now read-only.

The unofficial Python package that returns responses from Google Bard through a cookie value.


Development Status :: 7 - Inactive


[NOTICE]

Please go to the Gemini API package instead: python-gemini-api.

An unofficial Python wrapper, python-gemini-api, operates through reverse engineering, using cookie values to interact with Google Gemini, for users who struggle with frequent authentication problems or cannot authenticate via Google Authentication.

Developed in collaboration with Antonio Cheong.

Installation

pip install python-gemini-api
pip install git+https://github.com/dsdanielpark/Gemini-API.git

To upgrade to the latest version, use:

pip install -q -U python-gemini-api






Reflection on the Bard API Project #289

Google - Bard API


The Python package that returns responses from Google Bard through a cookie value.

Please exercise caution and use this package responsibly. This Python package is UNOFFICIAL.

I referred to the GitHub repository github.com/acheong08/Bard, where the inference process of Bard was reverse engineered. Using __Secure-1PSID, you can ask questions and get answers from Google Bard. Please note that bardapi is not a free service, but rather a tool provided to help developers test certain functionality, given the delayed development and release of an official Google Bard API. It has been designed with a lightweight structure that can easily adapt to the emergence of an official API. Therefore, I strongly discourage using it for any other purpose. If you have access to a reliable official PaLM-2 API or Google Generative AI API, replace the provided response with the corresponding official code. Check out #262.



What is Google Bard?

Bard is a conversational generative artificial intelligence chatbot developed by Google, based initially on the LaMDA family of large language models (LLMs) and later on the PaLM LLM. Please check the official documents for updates on Bard, including available regions and languages.

Install

$ pip install bardapi
$ pip install git+https://github.com/dsdanielpark/Bard-API.git

Because certain dependency packages are not compatible with 64-bit Windows, we are providing a lightweight alpha release of bardapi that only returns responses for simple requests. This release is a continuation of PyPI version 0.1.18, which was maintained with lightweight and simple functionality. See the alpha-release GitHub branch for more details.

$ pip install bardapi==0.1.23a

Authentication

Warning: Do not expose your __Secure-1PSID. Use it for testing purposes only and avoid using it directly in applications. Cookie values change periodically (every 15-20 minutes). Frequent session changes may briefly block access, and headless mode is challenging. Rate limiting applies and changes often. If the cookie changes, log out of your Google account, close the browser, and enter the new cookie value, or manually reset the cookie to get a new value. See the FAQ and issue pages for details.

  1. Visit https://gemini.google.com/
  2. F12 for console
  3. Session: Application → Cookies → copy the value of the __Secure-1PSID cookie. Alternatively, try using SIDCC as the token.

Note that while I refer to the __Secure-1PSID or SIDCC value as an API key for convenience, it is not an officially provided API key. The cookie value is subject to frequent changes, so verify it again if an error occurs; most errors occur when an invalid cookie value is entered. A minimal cookie-refresh sketch follows.
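
Since the cookie rotates, one simple pattern is to keep the value in the _BARD_API_KEY environment variable and re-read it on each call, so a freshly copied cookie is picked up without restarting the process. This is only a minimal sketch built on the package's documented Bard(token=...) and get_answer() calls; it is not a feature of the package itself.

import os
from bardapi import Bard

def ask(prompt: str) -> str:
    # Re-read the cookie on every call so an updated __Secure-1PSID value
    # (exported again after it rotates) is picked up automatically.
    token = os.environ["_BARD_API_KEY"]
    return Bard(token=token, timeout=30).get_answer(prompt)["content"]

try:
    print(ask("Hello, Bard"))
except Exception:
    # Most failures mean the cookie expired: copy a fresh value from the
    # browser, export _BARD_API_KEY again, and retry.
    raise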


If you need to set multiple cookie values:

  • Multi-cookie Bard - Once it is confirmed that multiple cookie values are required to receive responses reliably in certain countries, I will deploy it for testing purposes. Please debug and create a pull request; a rough sketch follows.
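
For now, here is a rough sketch of what setting several cookies on a shared requests.Session could look like. The extra cookie names (__Secure-1PSIDTS, __Secure-1PSIDCC) are assumptions taken from the cookies visible under Application → Cookies in the browser, not a documented package feature.

import requests
from bardapi import Bard

token = "xxxxxxx"  # __Secure-1PSID

session = requests.Session()
session.headers = {
    "Host": "gemini.google.com",
    "X-Same-Domain": "1",
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.114 Safari/537.36",
    "Content-Type": "application/x-www-form-urlencoded;charset=UTF-8",
    "Origin": "https://gemini.google.com",
    "Referer": "https://gemini.google.com/",
}
# Copy each value from your browser; the extra cookie names below are assumptions.
session.cookies.set("__Secure-1PSID", token)
session.cookies.set("__Secure-1PSIDTS", "xxxxxxx")
session.cookies.set("__Secure-1PSIDCC", "xxxxxxx")

bard = Bard(token=token, session=session, timeout=30)
print(bard.get_answer("Hi")["content"])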

Usage

Open In Colab

Simple Usage

from bardapi import Bard

token = 'xxxxxxx'
bard = Bard(token=token)
bard.get_answer("나와 내 동년배들이 좋아하는 뉴진스에 대해서 알려줘")['content']

Or you can use this

from bardapi import Bard
import os
os.environ['_BARD_API_KEY'] = "xxxxxxx"

Bard().get_answer("나와 내 동년배들이 좋아하는 뉴진스에 대해서 알려줘")['content']

To get the response dictionary

import bardapi

# set your __Secure-1PSID value to key
token = 'xxxxxxx'

# set your input text
input_text = "나와 내 동년배들이 좋아하는 뉴진스에 대해서 알려줘"

# Send an API request and get a response.
response = bardapi.core.Bard(token).get_answer(input_text)

To address errors caused by delayed responses in environments like Google Colab and containers, use the timeout argument if an error occurs despite following the proper procedure.

from bardapi import Bard
import os
os.environ['_BARD_API_KEY']="xxxxxxx"

bard = Bard(timeout=30) # Set timeout in seconds
bard.get_answer("나와 내 동년배들이 좋아하는 뉴진스에 대해서 알려줘")['content']

Further

Behind a proxy

If you are working behind a proxy, use the following.

from bardapi import Bard

# Change 'http://proxy.example.com:8080' to your http proxy
# timeout in seconds
proxies = {
    'http': 'http://proxy.example.com:8080',
    'https': 'https://proxy.example.com:8080'
}

bard = Bard(token='xxxxxxx', proxies=proxies, timeout=30)
bard.get_answer("나와 내 동년배들이 좋아하는 뉴진스에 대해서 알려줘")['content']

Use rotating proxies

If you want to avoid blocked requests and bans, use Smart Proxy by Crawlbase. It forwards your connection requests to a randomly rotating IP address in a pool of proxies before reaching the target website. The combination of AI and ML makes it more effective at avoiding CAPTCHAs and blocks.

from bardapi import Bard
import requests

# Get your proxy url at crawlbase https://crawlbase.com/docs/smart-proxy/get/
proxy_url = "http://xxxxxxxxxxxxxx:@smartproxy.crawlbase.com:8012" 
proxies = {"http": proxy_url, "https": proxy_url}

bard = Bard(token='xxxxxxx', proxies=proxies, timeout=30)
bard.get_answer("나와 내 동년배들이 좋아하는 뉴진스에 대해서 알려줘")['content']

Reusable session object

You can continue the conversation using a reusable session. However, this feature is limited, and it is difficult for a package-level feature to perfectly maintain conversation_id and context. You can try to maintain conversational consistency the same way as with other LLM services, for example by storing a summary of past conversations in a database and passing it along with each prompt (see the sketch after the session example below).

from bardapi import Bard
import requests
# import os
# os.environ['_BARD_API_KEY'] = 'xxxxxxx'
token = 'xxxxxxx'

session = requests.Session()
session.headers = {
    "Host": "gemini.google.com",
    "X-Same-Domain": "1",
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.114 Safari/537.36",
    "Content-Type": "application/x-www-form-urlencoded;charset=UTF-8",
    "Origin": "https://gemini.google.com",
    "Referer": "https://gemini.google.com/",
}
# session.cookies.set("__Secure-1PSID", os.getenv("_BARD_API_KEY")) 
session.cookies.set("__Secure-1PSID", token) 

bard = Bard(token=token, session=session, timeout=30)
bard.get_answer("나와 내 동년배들이 좋아하는 뉴진스에 대해서 알려줘")['content']

# Continue the conversation without setting a new session
bard.get_answer("What is my last prompt??")['content']

Async Bard code

import asyncio
import os

from httpx import AsyncClient
from bardapi import BardAsync

# Uncomment and set your API key as needed
# os.environ['_BARD_API_KEY'] = 'xxxxxxx'
token = 'xxxxxxx'  # Replace with your actual token

SESSION_HEADERS = {
    "Host": "gemini.google.com",
    "X-Same-Domain": "1",
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.114 Safari/537.36",
    "Content-Type": "application/x-www-form-urlencoded;charset=UTF-8",
    "Origin": "https://gemini.google.com",
    "Referer": "https://gemini.google.com/",
}
timeout = 30  # Example timeout
proxies = {}  # Replace with your proxies if needed

client = AsyncClient(
    http2=True,
    headers=SESSION_HEADERS,
    cookies={"__Secure-1PSID": token},
    timeout=timeout,
    proxies=proxies,
)

bard_async = BardAsync(token=token, client=client)

# Asynchronous function to get the answer
async def get_bard_answer(question):
    await bard_async.async_setup()  # Ensure async setup is done
    return await bard_async.get_answer(question)

# Run the coroutine with asyncio in a script; in Colab/Jupyter you can simply `await` it.
response = asyncio.run(get_bard_answer("나와 내 동년배들이 좋아하는 뉴진스에 대해서 알려줘"))
print(response['content'])

Auto Cookie Bard

Using browser_cookie3, the package extracts the __Secure-1PSID cookie from installed browsers, so the API can be used without passing the token explicitly. However, the dependency packages are still incomplete and many variables are involved, so please seek assistance in the related GitHub issues or adjust your browser version.

  • Visit https://gemini.google.com/ in your browser and execute the following command while in the chat-enabled state. Refer to browser_cookie3 for details on how it works. If any issues arise, please restart the browser or log in to your Google account again. It is recommended to keep the browser open.
from bardapi import Bard

bard = Bard(token_from_browser=True)
response = bard.get_answer("Do you like cookies?")
print(response['content'])

Bard ask_about_image method

This may not work for everyone, as it is only available for certain accounts and regions and is subject to other restrictions. As an experimental feature, it is possible to ask questions about an image; however, this functionality is only available for accounts with image upload capability in Bard's web UI.

from bardapi import Bard

bard = Bard(token='xxxxxxx')
image = open('image.jpg', 'rb').read() # (jpeg, png, webp) are supported.
bard_answer = bard.ask_about_image('What is in the image?', image)
print(bard_answer['content'])

Business users and high traffic volume may be subject to account restrictions according to Google's policies. Please use the official Google Cloud API for any other purpose. The user is solely responsible for all code, and it is imperative to consult Google's official services and policies. Furthermore, the code in this repository is provided under the MIT license, and it disclaims any liability, including explicit or implied legal responsibilities.

Bard speech method

As an experimental feature, Bard can also return generated speech audio for a text prompt; availability may vary by account and region:

from bardapi import Bard

bard = Bard(token='xxxxxxx')
audio = bard.speech('Hello, I am Bard! How can I help you today?')
with open("speech.ogg", "wb") as f:
  f.write(bytes(audio['audio']))

Starting from version 0.1.18, the GitHub version of BardAPI will be synchronized with the PyPI version and released simultaneously. However, the version undergoing QA can still be used from the GitHub repository.

$ pip install git+https://github.com/dsdanielpark/Bard-API.git

Amazing Bard Prompts Is All You Need!

  • Helpful prompts for Google Bard

If you want to comfortably use open-source LLMs in your native language, including models released under the Apache License (which allows free commercial use), you can try the hf-transllm package. hf-transllm also supports multilingual inference for various LLMs stored in Hugging Face repositories.

Example code of hf-transllm

In case the Google package is no longer available due to policy restrictions, here is simple example code for using open-source language models (LLMs) in English and in other languages.

Usage

You can easily use decoder models from Hugging Face either by following the simple approach below or by overriding the inference method. You can explore various open-source language models at this link, and the rankings in the Open LLM Leaderboard Report repository can help you find good models.

For LLMs that use languages other than English

from transllm import LLMtranslator

open_llama3b_kor = LLMtranslator('openlm-research/open_llama_3b', target_lang='ko', translator='google') # Korean

translated_answer = open_llama3b_kor.generate("나와 내 동년배들이 좋아하는 뉴진스에 대해서 알려줘")
print(translated_answer)

For LLMs that use English

Refer to https://github.com/openlm-research/open_llama or use it like this:

from transllm import LLMtranslator

open_llama3b = LLMtranslator('openlm-research/open_llama_3b')

answer = open_llama3b.generate("Tell me about the Korean girl group Newjeans.")
print(answer)

What is Google Gemini?

Gemini, formerly known as Bard, is an advanced multimodal AI model by Google DeepMind, capable of understanding and integrating various types of information, such as text, code, audio, images, and video.

Google AI Studio

Google AI Studio creates a new Google Cloud project for each new API key. You can also create an API key in an existing Google Cloud project. All projects are subject to the Google Cloud Platform Terms of Service.

Access to Gemini Pro in Bard API package

The Bard API sources responses from the official Google Bard (Gemini) website, so you receive the same responses as on the web. If Gemini answers are available on the web, you can also access Gemini through the Bard API. However, it's important to note that responses might also come from other models, not exclusively Gemini Pro or Ultra.

  • There is no official Bard API or early access/waiting list for Gemini, although the PaLM2 API is available.
    • Google's PaLM2 API differs from Bard, with some aspects of Bard being superior.
    • It is speculated that, after expert review, the Bard Advanced lineup will likely provide an official API in 2024.
  • Responses from Gemini and previous generative AI models are served randomly on the Bard web UI.
  • The Bard API, with its imperfect extension features (e.g., ask_about_image), occasionally demonstrates Gemini's capabilities. This behavior may vary by region, language, or Google account.
  • More information in the FAQ.

For more on Gemini, see the Gemini API package mentioned in the notice above.


Google PaLM

Try demo at https://makersuite.google.com/app/prompts/new_text.

who are you?
>> I am powered by PaLM 2, which stands for Pathways Language Model 2, a large language model from Google AI.

Google Generative AI

Quick Start

$ pip install -q google-generativeai
import pprint
import google.generativeai as palm

palm.configure(api_key='YOUR_API_KEY')

models = [m for m in palm.list_models() if 'generateText' in m.supported_generation_methods]
model = models[0].name
print(model)

prompt = "Who are you?"

completion = palm.generate_text(
    model=model,
    prompt=prompt,
    temperature=0,
    # The maximum length of the response
    max_output_tokens=800,
)
print(completion.result)


Sponsor

Use data scraping to train your AI models.

  • Easy to use API to crawl and scrape millions of websites
  • Use crawlbase for efficient data extraction for your LLMs
  • Average success rate: 98%
  • Uptime guarantee: 99.9%
  • Simple docs to get started in minutes
  • Asynchronous Crawling API if you need massive amounts of data
  • GDPR and CCPA compliant

Used by 70k+ developers.

Please check the FAQ and open issues for similar questions before creating a new issue. Repeated questions will be kept as open issues. Too many requests can trigger a temporary account block (HTTP 429). Maintain proper intervals between requests, using functions like sleep to avoid rate limits, as sketched below. Policies may vary by country and language, so any user could face temporary or permanent errors via the API.
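
A minimal sketch of keeping an interval between requests with time.sleep; the interval and back-off values here are arbitrary examples, not documented limits.

import time
from bardapi import Bard

bard = Bard(token="xxxxxxx", timeout=30)
prompts = ["First question", "Second question", "Third question"]

for prompt in prompts:
    try:
        print(bard.get_answer(prompt)["content"])
    except Exception as exc:
        # A temporary block (e.g. HTTP 429) usually clears after waiting.
        print(f"Request failed ({exc}); backing off before retrying later.")
        time.sleep(60)
    time.sleep(10)  # keep a generous interval between requests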

Scripts

In the scripts folder, I have released scripts to help you compare OpenAI ChatGPT, Microsoft EdgeGPT, and Google Bard. I hope they help more developers.
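
For illustration only, here is a sketch of how such a comparison might collect answers. Only the Bard call is concrete; the other backends are placeholders to be wired up with their own client packages, and this is not the actual script from the scripts folder.

import csv
from bardapi import Bard

bard = Bard(token="xxxxxxx", timeout=30)
prompts = ["What is the capital of France?", "Explain quantum entanglement in one sentence."]

rows = []
for prompt in prompts:
    rows.append({
        "prompt": prompt,
        "bard": bard.get_answer(prompt)["content"],
        # "chatgpt": ...,  # fill in with an OpenAI client
        # "edgegpt": ...,  # fill in with an EdgeGPT client
    })

with open("model_comparison.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["prompt", "bard"])
    writer.writeheader()
    writer.writerows(rows)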

Contributors

We would like to express our sincere gratitude to all the contributors.


License

MIT
We hold no legal responsibility; for more information, please refer to the bottom of this README. We simply ask that you give this repository and its contributors a star. This project is a personal initiative and is not affiliated with or endorsed by Google. It is recommended to use Google's official API.

The MIT License (MIT)

Copyright (c) 2023 Minwoo Park

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

Shifting Service Policies: Bard and Google's Dynamics

Bard's service status and Google's API interfaces are in constant flux. The number of replies is currently limited, but certain users, such as those utilizing VPNs or proxy servers, have reported slightly higher message caps. Adaptability is crucial in navigating these dynamic service policies. Please note that the cookie values used in this package are not official API values.

Bugs and Issues

We are sincerely grateful for any reports of new features or bugs. Your valuable feedback on the code is highly appreciated.

Contacts

Reference

[1] https://github.com/acheong08/Bard

Warning: Important notice. The user assumes all legal responsibilities associated with using the BardAPI package. This Python package merely facilitates easy access to Google Bard for developers. Users are solely responsible for managing data and using the package appropriately. For further information, please consult the Google Bard official documentation.

Warning: This Python package is not an official Google package or API service. It is not affiliated with Google and uses Google account cookies, which means that excessive or commercial usage may result in restrictions on your Google account. The package was created to support developers in testing functionality during delays in the official Google package. However, it should not be misused or abused. Please be cautious and refer to the README for more information.



Copyright (c) 2023 MinWoo Park, South Korea