Uploading Large Files to SharePoint Library From SPFx Web Part

Introduction

In this blog, we demonstrate how to upload large files to a SharePoint document library from an SPFx web part, even when the file size exceeds 100MB or reaches the GB range.

SharePoint provides a REST API for uploading files to a document library. The endpoint is {site_url}/_api/web/getfolderbyserverrelativeurl('/sites/{site_name}/{library_name}')/files/add(overwrite=true,url='{file_name}'). However, the issue is that this API only allows us to upload files up to 2MB in size; anything larger cannot be uploaded through it.
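
For reference, a minimal sketch of that simple upload call from an SPFx web part is shown below. `SiteUrl`, `digest`, and `file` are placeholder variables (how to obtain them is covered later in this post), and the folder path is only an example:

    // Minimal sketch of the simple (non-chunked) upload. This works only for
    // files within the size limit discussed above; the site/library path and
    // the SiteUrl, digest, and file variables are placeholders.
    const uploadResponse = await fetch(
      `${SiteUrl}/_api/web/getfolderbyserverrelativeurl('/sites/YourSite/YourLibrary')/files/add(overwrite=true,url='${file.name}')`,
      {
        method: "POST",
        headers: {
          "Accept": "application/json;odata=verbose",
          "X-RequestDigest": digest
        },
        body: file // the file content goes in the request body
      }
    );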

Here I came up with a solution that allows us to upload files well beyond that limit, going up to GBs, to a SharePoint document library from the SPFx web part. To achieve this, we can use the chunked upload process. The SharePoint REST API provides the methods StartUpload, ContinueUpload, and FinishUpload for this purpose. Using chunked upload, we can handle files of any size.

So, here is the guide on how we can achieve the chunked upload process in the SPFx web part.

Retrieve the file data from the input and divide it into chunks

  • First, we need to retrieve the form digest value from a REST API call and store it in a variable. This digest is sent with our POST requests so that SharePoint can validate them.
  • After receiving the file data from the input element, we store the file object.
  • Initialize a variable with your preferred chunk size.
    // Request a form digest; it must accompany every POST request to SharePoint
    const result = await fetch(`${SiteUrl}/_api/contextinfo`, {
      method: "POST",
      mode: "same-origin",
      headers: {
        "accept": "application/json"
      }
    });
    const response = await result.json();
    const digest = response["FormDigestValue"];

    // Read the selected file from the input element
    const fileInput = document.getElementById("YourInputFieldID") as HTMLInputElement;
    const file = fileInput.files[0];
    console.log("File variable when uploading:", file);

    const chunkSize = 104857600; // 100 MB in bytes
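
With the digest, file object, and chunk size in hand, everything can be handed to the upload function built in the next section. A hypothetical invocation (assuming the code runs inside the same web part class) might look like this:

    // Hypothetical call to the UploadLargeFile method defined in the next
    // section; parameter order matches its signature below.
    await this.UploadLargeFile(
      file,               // the File object read from the input element
      SiteUrl,            // your SharePoint site URL
      "YourLibraryName",  // title of the target document library (placeholder)
      file.name,          // name to give the file in the library
      chunkSize,          // 100 MB chunks, as initialized above
      digest              // the form digest retrieved above
    );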

Make a function that handles the large upload to the SharePoint document library

  • Create a custom function to handle large file uploads to the library. This function takes parameters such as the file data, file name, SharePoint site URL, document library name, form digest value, and desired chunk size.
  • At the beginning of the function, we declare some variables, such as the headers to pass to our REST API calls, the starting and ending byte positions of the first chunk, and so on.
  • Next, we call another function to start the upload session. This function simply adds a blank file to our document library to initialize the session. From the API response, we receive the unique ID of the blank file, which identifies the file whose content we need to overwrite.
  • We also generate a unique GUID, which is used in the start upload, continue upload, and finish upload methods.
  • We then check the starting position within the file and divide the file into chunks accordingly. For the first chunk, we call the "StartUpload" method with the GUID we generated to begin uploading the file to the document library.
  • After uploading the first chunk, we loop through every subsequent chunk and call the "ContinueUpload" REST API with the same GUID to upload it, continuing until we reach the second-to-last chunk.
  • For the last step, we upload the final chunk using the SharePoint REST API method "FinishUpload", which signals to SharePoint that this is the last chunk and completes the upload process.
    private async UploadLargeFile(
      file: Blob,
      siteUrl: string,
      libraryName: string,
      fileName: string,
      chunkSize: number,
      digest: string
    ) {
      const headers = {
        "Accept": "application/json;odata=verbose",
        "X-RequestDigest": digest
      };
      const fileSize = file.size;
      const uploadId = this.GenerateUploadId();
      let start = 0;
      let end = chunkSize;
      let chunkNumber = 0;
      let fileId = "";
    
      const uploadSessionResponse = await this.StartUploadSession(siteUrl, libraryName, fileName, headers);
      fileId = uploadSessionResponse.d.UniqueId;
    
      while (start < fileSize) {
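        // Note: this loop assumes the file spans more than one chunk (the
        // large-file case this post targets); a file smaller than chunkSize
        // would hit only the StartUpload branch and never call FinishUpload.
        // Files below the chunk size are better uploaded with the simple
        // Files/Add call from the introduction.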
        const chunk = file.slice(start, end);
        const isLastChunk = end >= fileSize;
    
        if (chunkNumber === 0) {
          await this.UploadFirstChunk(siteUrl, libraryName, fileName, chunk, uploadId, headers, fileId);
        } else if (isLastChunk) {
          await this.UploadLastChunk(siteUrl, libraryName, fileName, chunk, uploadId, headers, start, fileId);
        } else {
          await this.UploadIntermediateChunk(siteUrl, libraryName, fileName, chunk, uploadId, headers, start, fileId);
        }
    
        start = end;
        end = start + chunkSize;
        chunkNumber++;
      }
    }
    
    // Generates a random version 4 GUID to identify the upload session
    private GenerateUploadId(): string {
      return 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, c => {
        const r = Math.random() * 16 | 0;
        const v = c === 'x' ? r : (r & 0x3 | 0x8);
        return v.toString(16);
      });
    }
    
    // Starts the upload session by adding an empty file to the library
    private async StartUploadSession(siteUrl: string, libraryName: string, fileName: string, headers: any) {
      try {
        return await this.Retry(async () => {
          const response = await fetch(
            `${siteUrl}/_api/Web/Lists/getByTitle('${libraryName}')/RootFolder/Files/Add(url='${fileName}',overwrite=true)`,
            {
              method: 'POST',
              headers: headers
            }
          );
    
          if (!response.ok) {
            const errorText = await response.text();
            console.error('Failed to start upload session:', errorText);
            throw new Error(`Failed to start upload session: ${errorText}`);
          }
    
          return response.json();
        });
      } catch (error) {
        console.error('Failed to start upload session after retries:', error);
        throw error;
      }
    }

Start Upload Method

  • This method is called when attempting to upload the first chunk of the file to our SharePoint document library.
  • In this method, we call the SharePoint REST API with a POST request to the StartUpload method, passing the unique GUID we generated.
  • The endpoint for the API is "`${siteUrl}/_api/web/GetFileById('${fileId}')/StartUpload(uploadId=guid'${uploadId}')`".
     private async UploadFirstChunk(
      siteUrl: string,
      libraryName: string,
      fileName: string,
      chunk: Blob,
      uploadId: string,
      headers: any,
      fileId: string
    ) {
      try {
        return await this.Retry(async () => {
          const response = await fetch(
            `${siteUrl}/_api/web/GetFileById('${fileId}')/StartUpload(uploadId=guid'${uploadId}')`,
            {
              method: 'POST',
              headers: headers,
              body: chunk
            }
          );
    
          if (!response.ok) {
            const errorText = await response.text();
            console.error('Failed to upload first chunk:', errorText);
            throw new Error(`Failed to upload first chunk: ${errorText}`);
          }
    
          return response.json();
        });
      } catch (error) {
        console.error('Failed to upload first chunk after retries:', error);
        await this.CancelUpload(siteUrl, fileId, uploadId, headers);
        await this.DeleteFile(siteUrl, fileId, headers);
        throw error;
      }
    }
    

Continue Upload Method

  • The "Continue Upload" method in SharePoint's REST API allows for the upload of intermediate chunks of a file during a large file upload session.
  • The API endpoint for continuing the upload is: "/_api/web/GetFileById('<fileId>')/ContinueUpload(uploadId=guid'<uploadId>',fileOffset=<start>)".
  • This endpoint specifies the file being uploaded (fileId), the unique upload session ID (uploadId), and the starting byte position of the chunk (fileOffset).
  • The "file Offset" parameter specifies the starting byte position of the chunk being uploaded in the overall file. It helps SharePoint understand where this chunk fits within the entire file.
  • it indicates the position in the file where the current chunk starts.
  • For example, if the first chunk is 1MB (1048576 bytes) in size, the file Offset for the second chunk would be 1048576, the third chunk would be 2097152, and so on.
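To make the arithmetic concrete, here is a tiny illustrative sketch (the `offsetForChunk` helper is hypothetical, not part of the upload code):

    // Illustration only: the byte offset at which chunk number n starts, for a
    // fixed chunk size. The main loop in UploadLargeFile tracks the same value
    // in its `start` variable.
    const offsetForChunk = (n: number, chunkSize: number): number => n * chunkSize;
    console.log(offsetForChunk(1, 1048576)); // 1048576 -> fileOffset of the second chunk
    console.log(offsetForChunk(2, 1048576)); // 2097152 -> fileOffset of the third chunk

The complete ContinueUpload method, with retry and cleanup handling, looks like this: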
    private async UploadIntermediateChunk(siteUrl: string, libraryName: string, fileName: string, chunk: Blob, uploadId: string, headers: any, start: number, fileId: string) {
        try {
          return await this.Retry(async () => {
            const response = await fetch(`${siteUrl}/_api/web/GetFileById('${fileId}')/ContinueUpload(uploadId=guid'${uploadId}',fileOffset=${start})`, {
              method: 'POST',
              headers: headers,
              body: chunk
            });
    
            if (!response.ok) {
              const errorText = await response.text();
              console.error('Failed to upload chunk:', errorText);
              throw new Error(`Failed to upload chunk: ${errorText}`);
            }
            return response.json();
          });
        } catch (error) {
          console.error('Failed to upload intermediate chunk after retries:', error);
          await this.CancelUpload(siteUrl, fileId, uploadId, headers);
          await this.DeleteFile(siteUrl, fileId, headers);
          throw error;
        }
      }

Finish Upload Method

  • The "Finish Upload" method is used to upload the final chunk of a large file to a SharePoint library, signaling the end of the upload process.
  • The method sends a POST request to the SharePoint API endpoint to finish the upload.
  • The API endpoint is: "/_api/web/GetFileById('<fileId>')/FinishUpload(uploadId=guid'<uploadId>',fileOffset=<start>)".
      private async UploadLastChunk(siteUrl: string, libraryName: string, fileName: string, chunk: Blob, uploadId: string, headers: any, start: number, fileId: string) {
        try {
          return await this.Retry(async () => {
            const response = await fetch(`${siteUrl}/_api/web/GetFileById('${fileId}')/FinishUpload(uploadId=guid'${uploadId}',fileOffset=${start})`, {
              method: 'POST',
              headers: headers,
              body: chunk
            });
    
            if (!response.ok) {
              const errorText = await response.text();
              console.error('Failed to upload chunk:', errorText);
              throw new Error(`Failed to upload chunk: ${errorText}`);
            }
    
            return response.json();
          });
        } catch (error) {
          console.error('Failed to upload last chunk after retries:', error);
          await this.CancelUpload(siteUrl, fileId, uploadId, headers);
          await this.DeleteFile(siteUrl, fileId, headers);
          throw error;
        }
      }

Retry Mechanism

  • The Retry method implements a retry mechanism with exponential backoff to handle transient errors when making asynchronous requests.
  • The goal is to attempt an asynchronous operation multiple times in case of failure, with increasing delays between attempts. The method takes the parameters "fn", "retries", and "delay".
  • fn: A function that returns a promise. This is the operation that may fail and needs to be retried.
  • retries: The maximum number of attempts (default is 3).
  • delay: The initial delay in milliseconds before retrying (default is 1000ms or 1 second).
  • If the attempt fails but retries are still available, a warning is logged to the console indicating the failure and the delay before the next attempt.
  • retryAfter is calculated using exponential backoff (delay * Math.pow(2, attempt - 1)). This increases the delay exponentially with each failed attempt.
  • The code waits for the calculated delay using setTimeout.
  • This helps in scenarios where temporary issues may cause failures, giving the operation multiple chances to succeed before ultimately giving up after the specified retries. The exponential backoff strategy helps in reducing the load on the system by spacing out the retries progressively.
     private async Retry<T>(fn: () => Promise<T>, retries: number = 3, delay: number = 1000): Promise<T> {
        let attempt = 1;
        while (attempt <= retries) {
          try {
            return await fn();
          } catch (error) {
            if (attempt === retries) {
              throw error;
            }
            const retryAfter = delay * Math.pow(2, attempt - 1); // Exponential backoff
            console.warn(`Attempt ${attempt} failed. Retrying in ${retryAfter}ms...`);
            await new Promise(resolve => setTimeout(resolve, retryAfter));
            attempt++;
          }
        }
        // Unreachable, but satisfies TypeScript's code-path analysis
        throw new Error("Retry attempts exhausted.");
      }

Cancel Upload And Delete File

  • The "Cancel Upload" method is used to cancel an ongoing large file upload session in SharePoint. This is typically done when an error occurs during the upload process, and you want to terminate the session to prevent incomplete or corrupted files from being saved.
  • Sends a request to the SharePoint API to cancel the current upload session identified by uploadId.
  • Utilizes the unique fileId and uploadId to specify which upload session to cancel.
  • Helps in ensuring that partially uploaded files are not left in an inconsistent state.
  • The "Delete File" method is used to delete a file from a SharePoint library. This is usually called after canceling an upload session to remove any partially uploaded files and clean up the SharePoint library.
  • Sends a request to the SharePoint API to delete the file identified by fileId.
  • Utilizes necessary headers for authentication.
  • Ensures that any incomplete or unwanted file uploads are removed, maintaining the integrity of the document library.
     private async CancelUpload(siteUrl: string, fileId: string, uploadId: string, headers: any) {
        try {
          const response = await fetch(`${siteUrl}/_api/web/GetFileById('${fileId}')/CancelUpload(uploadId=guid'${uploadId}')`, {
            method: 'POST',
            headers: headers
          });
    
          if (!response.ok) {
            const errorText = await response.text();
            console.error('Failed to cancel upload session:', errorText);
            throw new Error(`Failed to cancel upload session: ${errorText}`);
          }
    
        } catch (error) {
          console.error('Error occurred while canceling upload session:', error);
        }
      }
      private async DeleteFile(siteUrl: string, fileId: string, headers: any) {
        try {
          const response = await fetch(`${siteUrl}/_api/web/GetFileById('${fileId}')`, {
            method: 'DELETE',
            headers: headers
          });
    
          if (!response.ok) {
            const errorText = await response.text();
            console.error('Failed to delete file:', errorText);
            throw new Error(`Failed to delete file: ${errorText}`);
          }
        } catch (error) {
          console.error('Error occurred while deleting file:', error);
        }
      }

Summary

This blog covers a method to upload large files to a SharePoint document library by dividing them into smaller chunks. The process involves:

  • Chunked Uploads: Files are split into first, intermediate, and last chunks to be uploaded sequentially.
  • Retry Mechanism: A retry strategy with exponential backoff handles transient errors, ensuring robust uploads.
  • Error Handling: The system cancels and deletes incomplete uploads if errors occur, maintaining library integrity.
  • Support Functions: Additional methods manage the cancellation and cleanup of uploads.

This method ensures efficient and reliable large file uploads to SharePoint.