This Jupyter Notebook allows you to scrape Google Scholar search results, extract paper information, and create a CSV file with the following columns: Paper Title, Year of Publication, Author, Publication Journal, and URL of the Paper.
Before using this notebook, make sure you have the following:
- Python 3.10 installed
- Required libraries installed (you can install them using `pip`): `requests`, `beautifulsoup4`, `pandas`
- Clone or download this repository to your local machine.
- Open the notebook `parsing.ipynb` in your Jupyter Notebook environment.
- Modify the `url` variable to specify your Google Scholar search query. For example, you can change `tirzepatide` to your desired search term (see the sketch below this list):
  `url = "https://scholar.google.com/scholar?start=0&q=tirzepatide&hl=en&as_sdt=0,5"`
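In that sketch, the `url` variable name and the base URL come from the notebook, while the `query` variable and the URL-encoding step are illustrative assumptions for swapping in a different search term:

```python
from urllib.parse import quote_plus

# Hypothetical replacement search term; the notebook's original query is "tirzepatide".
query = "semaglutide"

# Rebuild the same Google Scholar URL with the new, URL-encoded query.
url = f"https://scholar.google.com/scholar?start=0&q={quote_plus(query)}&hl=en&as_sdt=0,5"
print(url)
```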
Execute the notebook cell by cell. The notebook is divided into sections, each responsible for a specific task (a rough sketch of the scraping and extraction steps follows this list):
- Scraping the Google Scholar search results page.
- Extracting paper tags, citation links, and other relevant information.
- Parsing and formatting the data.
- Creating a CSV file with the extracted information.
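Here is that sketch. The function names `get_paperinfo` and `get_tags` appear in the notebook, but the bodies, the browser-style User-Agent header, and the CSS selectors (`h3.gs_rt` for titles, `div.gs_a` for author/journal/year lines) are assumptions about a typical requests/BeautifulSoup implementation; the actual notebook code may differ.

```python
import requests
from bs4 import BeautifulSoup

def get_paperinfo(paper_url):
    # Download the search results page and parse it with BeautifulSoup.
    response = requests.get(paper_url, headers={"User-Agent": "Mozilla/5.0"})
    response.raise_for_status()
    return BeautifulSoup(response.text, "html.parser")

def get_tags(doc):
    # Titles live under "h3.gs_rt", citation/author lines under "div.gs_a"
    # (assumed Google Scholar class names; they may change over time).
    paper_tags = doc.select("h3.gs_rt")
    cite_tags = doc.select("div.gs_a")
    return paper_tags, cite_tags
```

Keep in mind that Google Scholar rate-limits automated requests, so consider adding delays between requests if you scrape more than one results page.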
After executing the entire notebook, the CSV file containing the paper information will be generated. You can find this file in the same directory as the notebook.
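The final step amounts to collecting the extracted fields into a table and writing it out. A minimal sketch, assuming pandas and an output filename chosen here for illustration (the column names follow the description at the top of this README; the notebook's actual filename may differ):

```python
import pandas as pd

# Illustrative rows; in the notebook these values come from the parsed tags.
rows = [
    {
        "Paper Title": "Example paper title",
        "Year of Publication": "2023",
        "Author": "A. Author",
        "Publication Journal": "Example Journal",
        "URL of Paper": "https://example.com/paper",
    },
]

df = pd.DataFrame(rows)
df.to_csv("scholar_results.csv", index=False)  # written to the notebook's directory
```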
You can customize the notebook by modifying the following functions (a rough sketch of possible customizations follows this list):
- `get_paperinfo(paper_url)`: This function retrieves the content of a Google Scholar page. You can use it to scrape other search results.
- `get_tags(doc)`: Modify this function to select different tags or elements from the page source based on your requirements.
- `get_papertitle(paper_tag)`: If you want to extract additional information from the paper tags, customize this function.
- `get_author_year_publi_info(authors_tag)`: Adjust this function to extract different information from the author tags.
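The function and parameter names below come from the notebook, but the parsing logic, the selectors, and the dash-splitting heuristic are assumptions about a typical implementation, shown only to illustrate where customizations would go:

```python
def get_papertitle(paper_tag):
    # Return the visible title text of each result tag; extend this to also
    # collect the link target (tag.a["href"]) or snippet text if needed.
    return [tag.get_text(strip=True) for tag in paper_tag]

def get_author_year_publi_info(authors_tag):
    # A Google Scholar "gs_a" line usually reads "Authors - Journal, Year - Publisher".
    # Splitting on the hyphen is an assumed, fairly fragile heuristic.
    info = []
    for tag in authors_tag:
        parts = [p.strip() for p in tag.get_text(" ", strip=True).split("-")]
        authors = parts[0] if parts else ""
        journal_and_year = parts[1] if len(parts) > 1 else ""
        info.append((authors, journal_and_year))
    return info
```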
If you encounter any issues or have suggestions for improvements, please open an issue in this GitHub repository. Contributions and pull requests are welcome!