# proxygetter

proxygetter is a Python library that provides a fast, customizable way to scrape, filter, and manage proxies. It is powered by asyncio and aiohttp to validate proxies asynchronously. Originally designed to scrape sslproxies.org, it now supports customizable sources and multiple filters.
## Installation

Install the package via pip:

```shell
pip install proxygetter
```
## Usage

Manage your proxies with ease using the `ProxyManager` class:

```python
from proxygetter import ProxyManager

manager = ProxyManager()
```
Filter proxies based on their validity asynchronously:

```python
valid_proxies = manager.filter_with_validity(url="http://example.com")
```
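Under the hood, validity filtering of this kind is typically implemented by firing all checks concurrently and keeping only the proxies that respond. A minimal sketch of that pattern using plain asyncio, with a stubbed check standing in for a real aiohttp request (the names and logic here are illustrative, not proxygetter's internals):

```python
import asyncio

async def check_proxy(proxy, url):
    # Stand-in for a real aiohttp request routed through `proxy`;
    # here we just pretend proxies whose port ends in an even digit are alive.
    await asyncio.sleep(0)  # yield control, as a real network call would
    return proxy[-1] in "02468"

async def filter_valid(proxies, url):
    # Run every check concurrently and keep the proxies that passed.
    results = await asyncio.gather(*(check_proxy(p, url) for p in proxies))
    return [p for p, ok in zip(proxies, results) if ok]

proxies = ["10.0.0.1:8080", "10.0.0.2:3127", "10.0.0.3:8081"]
valid = asyncio.run(filter_valid(proxies, "http://example.com"))
```

The point of `asyncio.gather` is that all checks are in flight at once, so total wall time is bounded by the slowest proxy rather than the sum of all timeouts.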
Access details about each proxy through the `Proxy` class:

```python
proxy = valid_proxies[0]
print(proxy.get_requests_format())
print(proxy.get_selenium_format())
```
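For context on the two formats: requests expects a mapping for its `proxies=` argument, while Selenium's `--proxy-server` option takes a bare `host:port` string. A hypothetical helper showing the two shapes (not proxygetter's actual implementation):

```python
def requests_format(ip, port):
    # Mapping accepted by requests' `proxies=` argument.
    url = f"http://{ip}:{port}"
    return {"http": url, "https": url}

def selenium_format(ip, port):
    # Bare host:port string, e.g. for Chrome's --proxy-server option.
    return f"{ip}:{port}"

fmt = requests_format("203.0.113.5", 8080)
```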
Get proxies using advanced filters such as country code, anonymity level, HTTPS support, Google compatibility, and last-checked time:

```python
filtered_proxies = manager.get_proxies(
    country_code='US',
    anonymity='elite proxy',
    https=True,
    google=True,
    last_checked_max=600,
)
```
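Conceptually, each keyword acts as an AND-ed predicate over the scraped proxy attributes, with `last_checked_max` bounding the proxy's age in seconds. A rough sketch of that matching logic (the field names and dict representation are assumed for illustration, not taken from the library):

```python
proxies = [
    {"ip": "203.0.113.5", "country_code": "US", "anonymity": "elite proxy",
     "https": True, "google": True, "last_checked": 120},
    {"ip": "198.51.100.7", "country_code": "DE", "anonymity": "anonymous",
     "https": True, "google": False, "last_checked": 900},
]

def matches(proxy, **filters):
    # A proxy passes only if it satisfies every given filter (AND semantics).
    if "last_checked_max" in filters:
        if proxy["last_checked"] > filters.pop("last_checked_max"):
            return False
    return all(proxy[key] == value for key, value in filters.items())

hits = [p for p in proxies
        if matches(p, country_code="US", https=True, last_checked_max=600)]
```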
You can also fetch a random proxy matching the specified filters:

```python
random_proxy = manager.get_random_proxy(country_code='US', https=True)
```
Configure the default user agent and timeout using environment variables:

```shell
export PROXY_URL_CHECKER_USER_AGENT=your_user_agent
export PROXY_URL_CHECKER_TIMEOUT=your_timeout_value
```
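Reading such variables typically follows the usual `os.environ` pattern with a fallback default; a sketch of that pattern (the fallback values here are made up, not proxygetter's actual defaults):

```python
import os

# Fallback values are illustrative, not the library's real defaults.
user_agent = os.environ.get("PROXY_URL_CHECKER_USER_AGENT", "Mozilla/5.0")
timeout = float(os.environ.get("PROXY_URL_CHECKER_TIMEOUT", "5"))
```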
## License

This project is licensed under the MIT License; see the LICENSE file for details.
## Roadmap

- Proxy blacklisting
- Additional proxy databases
- Enhanced documentation and examples

Feel free to contribute or suggest improvements.