
Browser file segmented breakpoint upload

Mar 10, 2018 pm 03:52 PM

This article walks through segmented (chunked), resumable uploads of files from the browser, with a practical example covering the points to watch out for.

The back end uses Python Flask.

Front-end flow:

1. Compute the file's feature code (hash).
2. Read the file metadata and split the file into segments.
3. Ask the server whether an unfinished upload with the same feature code exists.
4. If it does, resume from the upload progress the server reports.
5. Otherwise, start from segment 0.
6. Upload the segments asynchronously, one after another, in a loop.
7. When the last segment is uploaded, report success.
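The segmentation in step 2 is just fixed-size byte ranges. A minimal Python sketch of the arithmetic (the 2 MB segment size matches the front-end code below; the function name is mine, not the article's):

```python
import math

FILE_SPLIT_SIZE = 1024 * 1024 * 2  # 2 MB per segment, as in the front-end code

def shard_ranges(file_size, chunk=FILE_SPLIT_SIZE):
    """Return the (start, end) byte range of every segment, mirroring f.slice()."""
    count = math.ceil(file_size / chunk)  # total number of segments (shardCount)
    return [(i * chunk, min(file_size, (i + 1) * chunk)) for i in range(count)]

ranges = shard_ranges(5 * 1024 * 1024)  # a 5 MB file yields 2 MB + 2 MB + 1 MB
```

The front end does the same computation with `Math.ceil` and `File.slice`.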

Back-end flow:

1. Receive the request parameter (the file hash).
2. Check whether an upload of this file was interrupted.
3. If a folder named after the hash exists, return the number of file segments already in it to the front end.
4. If it does not exist, return 0 or an empty string.
5. When the front end uploads a file segment, save it and record its index.
6. When the upload is complete, merge the segments into one file and delete them.
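Steps 2–4 of the back-end flow boil down to counting the segment files under the hash folder. A small sketch under the same layout assumptions as the Flask code below (uploads/&lt;hash&gt;/&lt;name&gt;.part&lt;i&gt;; the function and paths here are illustrative):

```python
import os
import tempfile

def resume_index(upload_root, file_hash):
    """Return the index of the last saved segment, or None for a fresh upload."""
    folder = os.path.join(upload_root, file_hash)
    if not os.path.isdir(folder):
        return None  # nothing uploaded yet -> the front end starts at segment 0
    return len(os.listdir(folder)) - 1  # index of the last segment on disk

# Simulate an interrupted upload with two saved segments.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "abc123"))
for i in range(2):
    open(os.path.join(root, "abc123", "demo.bin.part%d" % i), "wb").close()
```

With this layout, resume_index(root, "abc123") reports index 1, and an unknown hash reports None.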

HTML code

The code takes a single file upload as an example and uses hashMe.js to obtain the feature code.

<!DOCTYPE html>
<html>
<head>
    <meta charset="UTF-8">
    <title></title>
    <script type="text/javascript" src="http://cdn.bootcss.com/jquery/3.1.1/jquery.min.js"></script>
    <script type="text/javascript" src="md5.js"></script>
    <script src="hashme.js"></script>
</head>
<body>
    <input type="file" onchange="hhh(this.files[0])" />
    <button onclick="uploadCk()">Test</button>
    <script>
        var up_f; // information about the file to upload
        var fileSplitSize = 1024 * 1024 * 2; // 2 MB per segment
        function hhh(f) {
            if (true) { // placeholder for a file-size check
                var hash = new hashMe(f, function(msg) {
                    up_f = new Object();
                    up_f.hash = msg;
                    up_f.name = f.name;
                    up_f.size = f.size;
                    up_f.shardCount = Math.ceil(f.size / fileSplitSize); // total number of segments
                    up_f.shard = []; // the file segments
                    for (var i = 0; i < up_f.shardCount; i++) {
                        var start = i * fileSplitSize;
                        var end = Math.min(f.size, start + fileSplitSize);
                        up_f.shard[up_f.shard.length] = f.slice(start, end); // save this segment
                    }
                });
            }
        }
        function uploadCk() { // pre-upload check
            $.ajax({
                url: "/upload_ck",
                type: "get",
                data: { hash: up_f.hash },
                success: function(data) {
                    if (data != "") {
                        upload(Number(data)); // resume from the segment index the server reports
                    } else {
                        upload(0); // start a fresh upload
                    }
                }
            });
        }
        function upload(loadIndex) { // upload one segment
            var form = new FormData();
            form.append("hash", up_f.hash);
            form.append("name", up_f.name);
            form.append("size", up_f.size);
            form.append("shardCount", up_f.shardCount);
            form.append("blob", up_f.shard[loadIndex]);
            form.append("sdIndex", loadIndex);
            console.log("sdIndex:" + loadIndex + ",shardCount:" + up_f.shardCount);
            $.ajax({
                url: "/upload",
                type: "POST",
                data: form,
                async: true,
                processData: false, // important: tell jQuery not to process the form data
                contentType: false, // important: must be false so the correct Content-Type is generated
                success: function(data) {
                    data = Number(data) + 1;
                    // Note: the final request (sdIndex == shardCount) carries no real blob;
                    // it is what tells the server side to merge the segments.
                    if (data <= up_f.shardCount) {
                        console.log("data:" + data);
                        upload(data);
                    } else {
                        console.log("Upload finished");
                    }
                }
            });
        }
    </script>
</body>
</html>

Python code

The Python code in this example is somewhat rough; please don't imitate it as-is (the mime module used here is a separate download).

from flask import Flask, url_for, request
import codecs, re, os
import urllib.parse, mime
import shutil
from werkzeug.routing import BaseConverter

class RegexConverter(BaseConverter):
    def __init__(self, map, *args):
        self.map = map
        self.regex = args[0]

app = Flask(__name__)
mim = mime.types
app.config['UPLOAD_FOLDER'] = 'uploads/'  # where uploads are saved
app.url_map.converters['regex'] = RegexConverter

@app.route('/<regex(".*"):url>')
def index(url):
    ps = urllib.parse.unquote(url)
    if ps == "upload":
        return upload()
    elif ps.split('?')[0] == "upload_ck":
        if os.path.exists("./" + app.config['UPLOAD_FOLDER'] + str(request.args.get('hash'))):
            # return the index of the last saved segment
            return str(len(os.listdir("./" + app.config['UPLOAD_FOLDER'] + str(request.args.get('hash')))) - 1)
        else:
            return ""
    bt = codecs.open(ps, 'rb', "utf-8").read()
    return bt, 200, {'Content-Type': mim[url.split(".")[-1]]}

@app.route('/upload', methods=['POST'])
def upload():
    hashtxt = request.form['hash']
    sPs = "./" + app.config['UPLOAD_FOLDER'] + hashtxt + "/"
    if not os.path.exists(sPs):  # the hash folder does not exist yet
        os.makedirs(sPs)  # create it
    uploaded_files = request.files.getlist("blob")  # get the uploaded file streams
    filePs = hashtxt + "/" + request.form['name'] + ".part" + request.form['sdIndex']  # segment save path
    for file in uploaded_files:
        file.save(os.path.join(app.config['UPLOAD_FOLDER'], filePs))  # save the segment
    if int(request.form['shardCount']) == int(request.form['sdIndex']):  # the extra final request marks completion
        mergeFile(app.config['UPLOAD_FOLDER'], request.form['name'], hashtxt)  # merge the segments
        shutil.rmtree("./" + app.config['UPLOAD_FOLDER'] + hashtxt)  # delete the segment folder
    return request.form['sdIndex']  # return the segment index

def mergeFile(ps, nm, hs):  # merge the segments into one file
    temp = open(ps + "/" + nm, 'wb')  # create the target file
    count = len(os.listdir(ps + "/" + hs))
    for i in range(0, count):
        fp = open(ps + "/" + hs + "/" + nm + ".part" + str(i), 'rb')  # read each segment as binary
        temp.write(fp.read())  # append its bytes
        fp.close()
    temp.close()

with app.test_request_context():
    pass  # could print URLs here

if __name__ == '__main__':
    app.debug = True
    app.run()
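The merge step can be sanity-checked in isolation. Here is a small sketch (the function name and part layout are mine, chosen to match mergeFile) that writes two parts and merges them:

```python
import os
import tempfile

def merge_parts(folder, name, part_dir):
    """Concatenate name.part0, name.part1, ... into folder/name, like mergeFile."""
    with open(os.path.join(folder, name), "wb") as out:
        for i in range(len(os.listdir(part_dir))):
            with open(os.path.join(part_dir, name + ".part" + str(i)), "rb") as fp:
                out.write(fp.read())  # append this segment's bytes in index order

# Write two segments, merge them, and read the result back.
root = tempfile.mkdtemp()
parts = os.path.join(root, "abc123")
os.makedirs(parts)
with open(os.path.join(parts, "demo.bin.part0"), "wb") as f:
    f.write(b"hello ")
with open(os.path.join(parts, "demo.bin.part1"), "wb") as f:
    f.write(b"world")
merge_parts(root, "demo.bin", parts)
merged = open(os.path.join(root, "demo.bin"), "rb").read()
```

Reading the parts strictly by index, not by directory order, is what keeps the merged bytes correct.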

The example is this simple, but real-world problems are not. For instance, before verifying and uploading, you could first check whether a file with the same feature code and size already exists on the server, and either copy it directly into the upload directory ("instant upload") or prompt the user whether to overwrite. You can also optimize further, for example by uploading several segments in parallel instead of strictly one after another.
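For the "instant upload" idea, the server needs the same feature code the front end computes. A sketch of a chunked MD5 (hashMe.js presumably computes something similar via md5.js, but its exact algorithm is an assumption here):

```python
import hashlib
import io

def feature_code(stream, chunk=1024 * 1024 * 2):
    """MD5 of a stream read in 2 MB chunks, so large files never sit in memory whole."""
    h = hashlib.md5()
    for block in iter(lambda: stream.read(chunk), b""):
        h.update(block)
    return h.hexdigest()

code = feature_code(io.BytesIO(b"hello world"))
```

Because the hash is computed incrementally, the same routine works for files far larger than available memory.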

I believe you have mastered the method after reading this case. For more, see the related articles on the php Chinese website!


