How to handle anti-crawler measures in Python
A web crawler is a program that automatically extracts web pages. It downloads pages from the World Wide Web for search engines and is an important component of them. But when crawlers are abused, the Internet fills with homogeneous content and original work goes unprotected. As a result, many websites have begun to fight back against crawlers, using every means available to protect their content.
1: User-Agent and Referer detection
User-Agent is a field in the HTTP protocol that describes the terminal issuing the request. It lets the server identify the client's operating system and version, CPU type, browser and version, rendering engine, language, plug-ins, and so on.
Through this field the server can tell who is visiting the website, and it can block requests that do not come from a normal browser.
Solution:
Disguise the request as coming from a browser. Since every browser has a different User-Agent and any user may be using any browser, UA detection can be defeated by attaching a real browser's User-Agent string to each request.
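As a minimal sketch of this idea, the snippet below picks a random browser User-Agent from a small pool and attaches it to a request built with the standard-library urllib. The UA strings and URL are illustrative placeholders; a real pool would be much larger.

```python
import random
import urllib.request

# Illustrative pool of real-looking browser User-Agent strings.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/120.0 Safari/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 "
    "(KHTML, like Gecko) Version/17.0 Safari/605.1.15",
    "Mozilla/5.0 (X11; Linux x86_64; rv:121.0) Gecko/20100101 Firefox/121.0",
]

def build_request(url: str) -> urllib.request.Request:
    """Attach a randomly chosen browser User-Agent to the request."""
    headers = {"User-Agent": random.choice(USER_AGENTS)}
    return urllib.request.Request(url, headers=headers)

# Each call may carry a different UA, which defeats naive UA-based blocking.
req = build_request("https://example.com/")
```

The same pattern works with the third-party requests library by passing the `headers` dict to `requests.get`.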
Referer is another header field. When the browser sends a request to a web server, it usually carries a Referer telling the server which page the request was linked from. For example, some image sites check the Referer value when you request a picture; if it does not match, the normal picture is not returned.
Solution:
When requesting a resource whose server checks the Referer, include a matching Referer value in the request headers.
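A short sketch of that solution, again with stdlib urllib: when fetching an image, pretend the request came from the page that embeds it by setting the Referer header explicitly (both URLs below are made up for illustration).

```python
import urllib.request

def image_request(img_url: str, page_url: str) -> urllib.request.Request:
    """Request an image while claiming it was linked from page_url."""
    # Anti-hotlinking checks compare this header against the site's own pages.
    headers = {"Referer": page_url}
    return urllib.request.Request(img_url, headers=headers)

req = image_request("https://example.com/img/1.jpg",
                    "https://example.com/gallery")
```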
2: js obfuscation and rendering
JavaScript obfuscation (in the minification sense) generally involves:
1. Removing functions that are never actually called.
2. Merging scattered variable declarations.
3. Simplifying logical functions.
4. Shortening variable names.
The exact output depends on the strengths and weaknesses of the compression tool; common tools include UglifyJS and JScrambler.
JS rendering means the HTML page is modified at runtime. For example, some pages return no data in the initial HTML; the data is inserted into the page by JavaScript after it loads. A basic crawler does not execute JavaScript, so this situation has to be handled in other ways.
Solution:
1. Read the website's JS source code, find the key logic, and reimplement it in Python.
2. Read the website's JS source code, find the key code, and execute it directly with libraries such as PyV8 or execjs.
3. Simulate a full browser environment with the selenium library.
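As a sketch of solution 1 (reimplementing the key JS in Python): suppose, purely as a made-up example, that a site's JavaScript signs each API request with an MD5 hash of the path, a timestamp, and a hard-coded salt found in the JS source. The same logic is easy to port to Python:

```python
import hashlib

# Hypothetical salt recovered from the site's JS source (made up here).
SALT = "secret123"

def sign_request(path: str, ts: int) -> str:
    """Reimplement the (hypothetical) JS signature: md5(path|ts|salt)."""
    raw = f"{path}|{ts}|{SALT}"
    return hashlib.md5(raw.encode("utf-8")).hexdigest()

# The resulting token would be sent as a query parameter or header.
token = sign_request("/api/data", 1700000000)
```

When the logic is too tangled to port, solution 2 (running the original JS with execjs) or solution 3 (driving a real browser with selenium) avoids the translation work at the cost of extra dependencies.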
3: IP restriction frequency
Web systems talk to the web server over HTTP, and each request creates at least one TCP connection between the client and the server.
The server can therefore see exactly how many requests a given IP address makes within a unit of time.
When that number exceeds a certain threshold, the traffic can be flagged as an abnormal user request.
Solution:
1. Build your own IP proxy pool and rotate through it, attaching a different proxy address to each request.
2. Use ADSL dynamic dialing: each time you redial you are assigned a new IP, so your address is never fixed.
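A minimal sketch of the proxy-pool idea in solution 1: a small class that cycles through a list of proxy addresses round-robin, handing out a different one for each request. The addresses are placeholders; a real pool would also health-check and evict dead proxies.

```python
import itertools

class ProxyPool:
    """Round-robin proxy pool; addresses below are placeholders."""

    def __init__(self, proxies):
        self._cycle = itertools.cycle(proxies)

    def next_proxy(self) -> str:
        # Each call advances to the next proxy, wrapping around at the end.
        return next(self._cycle)

pool = ProxyPool(["http://10.0.0.1:8080", "http://10.0.0.2:8080"])
# Pass pool.next_proxy() as the proxy for each outgoing request,
# e.g. via urllib.request.ProxyHandler or requests' `proxies=` argument.
```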
4: Verification code
A verification code, or CAPTCHA ("Completely Automated Public Turing test to tell Computers and Humans Apart"), is a public, fully automated test that distinguishes whether the user is a computer or a human.
It helps prevent malicious password cracking, ticket scalping, and forum flooding, and effectively stops a program from brute-forcing logins against a specific registered account.
The challenge can be generated and graded by a computer, but only a human is supposed to be able to answer it, so a user who answers correctly can be assumed to be human.
Solution:
1. Identify the verification codes manually.
2. Recognize simple verification codes with pytesseract.
3. Connect to a third-party captcha-solving platform.
4. Train a machine-learning model to recognize them.
