File Storage

The File component connects a form to a file storage provider and allows the user to upload, view, and manage files. Since the platform does not itself store files, we offer a range of third-party integrations to meet your file storage needs.

Azure Blob

The Azure Blob file upload system allows you to upload files from your hosted forms directly to an Azure Blob storage account. Here are the steps to set up this feature.

Azure Portal

Before we begin, you must create a new account, or have an existing account, within Microsoft Azure.

Create new File Service

  • Once in the Azure portal interface, click on Storage Accounts and add a new storage account.

  • Next, click on the Create button to create the new storage account.

  • Once the storage account has been created, click on it, then click on Blobs under the Blob Service section.

  • Next, create a new Container by clicking + Container and then provide a name for your container.

  • Next, we need to ensure that files can be uploaded from the domain of our application. To do this, click on the CORS section and configure the Blob service CORS rules with the domains where your application and the portal are hosted.

  • Now, click on Access Keys and copy the credentials for later use.
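The CORS rule described above can be sketched as a settings object. This is a minimal sketch; the field names follow the Azure Blob service CORS schema, and the example domains are placeholders you should replace with the domains where your forms and portal are hosted:

```python
# Sketch of a Blob service CORS rule. The domains below are placeholders --
# replace them with wherever your application and portal are hosted.
cors_rule = {
    "AllowedOrigins": ["https://yourapp.example.com", "https://portal.example.com"],
    "AllowedMethods": ["GET", "PUT", "POST", "HEAD", "OPTIONS"],
    "AllowedHeaders": ["*"],
    "ExposedHeaders": ["*"],
    "MaxAgeInSeconds": 200,
}
```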

Enter Azure Project Settings

  • Go to your Project Settings page and click on Integrations ❘ File Storage ❘ Azure Blob.

  • Enter the information that you copied in the previous steps.

  • Save your project settings.

  • Now, for every File component that you add to a form, select Azure File Services from the Storage dropdown.

Custom Url

The Custom Url provider does not have any settings in the Project Settings. Instead, it is configured via the Url field on the File component when it is added to a form.

In order to use a Custom Url provider, you will need to set up a service that can upload and serve files.

The file is posted to the server as multipart form data with the following field:

  file: file

The server should save the contents of the file somewhere and return the following object:

  {
    url: '',
    name: 'The_Name_Of_The_File.doc',
    size: 1000
  }

You may return additional attributes if desired.


S3

The S3 Storage provider allows file storage and retrieval using any S3-compatible service, but was specifically designed for Amazon Web Services S3.

For an on-premise compatible solution, try Minio. See the Minio section below.

If you haven’t already done so, go to Amazon Web Services and sign up for an account.

In order to use S3, you will need to configure an IAM user and an S3 bucket.

Create an IAM user

  • Go to Services ❘ IAM and click on the Users tab.

  • Click the Add User button.

  • Enter a user name such as “S3”, select Programmatic Access, then click the Next button.

  • Skip the page where you add the user to a group.

  • On the last page, press Create User.

  • The next page should show you your access keys. Copy these and save them in a note on your computer; you will need them later.

  • Now press the Close button, which will take you back to the users list.

  • Find the user you just created and click on their name, which will show you their Summary.

  • Copy the ARN of that user and save it along with your Access Keys from earlier.

  • We are now ready to create the S3 bucket.

Create an S3 bucket

  • Go to Services ❘ S3

  • Click + Create Bucket, enter a name and region for the bucket, and click Next.

  • Click Next for all the other pages of the wizard to complete the creation of the bucket. We can configure all of these options after the bucket is created.

  • Next click on that bucket which will take you to the bucket page. Click on the Permissions tab.

  • Now, click on the Bucket Policy button and then add the following policy to your bucket.

        {
            "Version": "2012-10-17",
            "Statement": [
                {
                    "Sid": "UploadFile",
                    "Effect": "Allow",
                    "Principal": {
                        "AWS": "arn:aws:iam::XXXXXX:user/S3"
                    },
                    "Action": [
                        "s3:PutObject",
                        "s3:PutObjectAcl"
                    ],
                    "Resource": "arn:aws:s3:::formio-upload/*"
                }
            ]
        }
  • If you wish to have Public Read access to your files, then you will need to add the following rule to your policy.

        {
            "Sid": "crossdomainAccess",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::formio-upload/crossdomain.xml"
        }

    So that it looks like the following.

        {
            "Version": "2012-10-17",
            "Statement": [
                {
                    "Sid": "UploadFile",
                    "Effect": "Allow",
                    "Principal": {
                        "AWS": "arn:aws:iam::XXXXXX:user/S3"
                    },
                    "Action": [
                        "s3:PutObject",
                        "s3:PutObjectAcl"
                    ],
                    "Resource": "arn:aws:s3:::formio-upload/*"
                },
                {
                    "Sid": "crossdomainAccess",
                    "Effect": "Allow",
                    "Principal": "*",
                    "Action": "s3:GetObject",
                    "Resource": "arn:aws:s3:::formio-upload/crossdomain.xml"
                }
            ]
        }
  • Important Note: make sure you replace arn:aws:iam::XXXXXX:user/S3 with the ARN you copied when you created the user, and also replace formio-upload with the name of your new bucket.

  • Next, click on CORS configuration and add the following.

        [
            {
                "AllowedHeaders": [
                    "*"
                ],
                "AllowedMethods": [
                    "GET",
                    "PUT",
                    "POST",
                    "HEAD"
                ],
                "AllowedOrigins": [
                    "*"
                ],
                "ExposeHeaders": [],
                "MaxAgeSeconds": 3000
            }
        ]
  • You may replace "AllowedOrigins": ["*"] with the domain names of the sites your app will run on, or leave it open.

  • Make sure you save the configuration and the bucket settings.

S3 Bucket Encryption (optional)

The platform also supports S3 Encryption to provide further safety for the files stored within the S3 system. If you wish to enable this feature, click on the Properties tab, then click on Default Encryption and provide the encryption you would like to use.

Your S3 bucket should now be properly configured.

Enter S3 Project Settings

  • Go to your Project Settings page and click on Integrations ❘ File Storage ❘ S3 Storage.

  • Enter the information for the IAM user and S3 bucket you just created. Make sure to provide all the necessary configurations that match your setup.

  • The following credentials can be provided.

    Use Minio Server
    Check this if you would like to use these settings to connect to a Minio server.

    Access Key ID
    The IAM user Access Key ID that you copied when setting up the S3 user.

    Secret Access Key
    The IAM user Secret Access Key that you copied when setting up the S3 user.

    Bucket Name
    The name of the bucket you created.

    Bucket URL
    If you are using your own S3-compatible server, provide its URL here.

    Bucket Region
    The region in which you set up your bucket.

    Starts With
    The top-most folder within which you wish to place all the files from this project.

    Access Control List
    Determines the access control of the files within this bucket.

    S3 Encryption
    The type of encryption you are using for this S3 bucket.

    KMS Key ID
    Only valid if you are using KMS encryption, in which case you need to provide your KMS Key ID.

    Max Size
    The maximum file size for uploads going into this bucket.

    Policy Expiration
    The number of seconds that the upload policy is valid for.

  • Now make sure to Save the project settings.

  • Now, for every File component that you add to a form, select S3 from the Storage dropdown.
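The Policy Expiration setting above controls how long a signed upload policy stays valid. As a rough sketch of the mechanism (the exact conditions the platform signs are not shown here, and the bucket name and key prefix are placeholders), an S3 browser-upload policy is a base64-encoded JSON document with an expiration timestamp:

```python
import base64
import json
from datetime import datetime, timedelta, timezone

def build_post_policy(bucket, key_prefix, expires_in_seconds):
    """Sketch of an S3 browser-upload (POST) policy document.

    The real policy the platform generates may include more conditions;
    this only illustrates how Policy Expiration bounds the upload window.
    """
    expiration = datetime.now(timezone.utc) + timedelta(seconds=expires_in_seconds)
    policy = {
        "expiration": expiration.strftime("%Y-%m-%dT%H:%M:%SZ"),
        "conditions": [
            {"bucket": bucket},
            ["starts-with", "$key", key_prefix],
        ],
    }
    # S3 expects the policy base64-encoded; it is then signed with the
    # IAM user's secret key before being sent along with the upload.
    return base64.b64encode(json.dumps(policy).encode()).decode()

encoded = build_post_policy("formio-upload", "uploads/", 3600)
```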

Multipart Upload

AWS S3 offers a Multipart Upload feature that allows users to upload files up to 5TB in size. It can also provide enhanced flexibility when uploading smaller files. The File component supports this feature by letting the form builder opt in to Multipart Upload support on a per-form basis.

The Multipart Upload feature is not yet compatible with Minio S3.

To use the File Component with Multipart Upload support:

  • Ensure that your AWS S3 bucket CORS policy exposes the "ETag" header.

  • In the File Component settings, select the checkbox labeled "Use the S3 Multipart Upload API."

  • Enter a "part size" in megabytes. The Multipart Upload feature works by "chunking" the file(s) into parts as close to this size as possible. Although this field is required, you can enter your best estimate based on what kinds of files you believe your form users will be uploading. If a file happens to be smaller than the chunk size, a "best-guess" part size will be used to upload it, so don't worry about being too precise.
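To make the chunking behavior concrete, here is a sketch (not the platform's actual implementation) of how a file might be split into parts of roughly the configured size:

```python
def plan_parts(file_size, part_size_mb):
    """Return (start, end) byte offsets for each upload part.

    Sketch only: a real uploader also has to respect AWS's limits
    (each part except the last must be at least 5 MB, and a single
    multipart upload may have at most 10,000 parts).
    """
    part_size = part_size_mb * 1024 * 1024
    parts = []
    start = 0
    while start < file_size:
        end = min(start + part_size, file_size)
        parts.append((start, end))
        start = end
    return parts

# A 12 MB file with a 5 MB part size yields three parts: 5 MB + 5 MB + 2 MB
parts = plan_parts(12 * 1024 * 1024, 5)
```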


Minio

The platform supports On-Premise or Private Cloud file hosting through the use of the Minio server. These instructions will walk you through how to set up a Minio deployment and configure the platform to point to it.

Minio Setup

To deploy a new Minio server, you will first need to install Docker, either on your local machine or on your private cloud server. Once you have Docker installed, you can run the following command to spin up a new Minio instance.

docker run -itd \
  --name formio-minio \
  --restart unless-stopped \
  -p 9000:9000 \
  -e "MINIO_ACCESS_KEY=youraccesskey" \
  -e "MINIO_SECRET_KEY=yoursecretkey" \
  -v ~/minio/data:/data \
  -v ~/minio/config:/root/.minio \
  minio/minio server /data

In this example, note that Minio mounts a local folder at ~/minio. This can be changed to any internal drive on your machine where you would like to store the Minio files.

Once Minio is running, you can verify it by typing the following.

docker ps

You should see the formio-minio container in the list.

You can also go to the URL http://localhost:9000 in your browser and see the Minio interface. You can log in using the MINIO_ACCESS_KEY and MINIO_SECRET_KEY that you used to spin up the Minio server.

Once there, you will want to create a new Bucket by clicking on the + icon toward the bottom right of the screen and then selecting Create Bucket. You can give your bucket any name you wish, then press Enter to save.

Keep the name of the bucket for the configuration steps below.

Using NGROK for local testing

If you wish to test this capability locally, it is recommended to use NGROK to create a web-accessible proxy to your localhost. Once you have NGROK installed, run the following on your machine to create a secure web tunnel to your localhost.

ngrok http 9000

With this running, you should be able to go to the HTTPS url provided in your browser, where you will see your locally running Minio server portal. Keep this URL for the next steps.

Configuration

Now that you have a Minio server running, the next step is to configure your portal settings to point to it. Log into the portal and navigate to your Project. Once there, go through the following steps.

  • Go to Settings ❘ Integrations ❘ File Storage ❘ Amazon S3 / Minio (On-Premise, Private Cloud)

  • Click on the checkbox that says Use Minio Server

  • Next, configure the Minio settings with your own values.

Press Save to save your settings.

Create a Form that uses Minio

Next, you will need to create a new Webform that has a File component. In the configuration of the File component, make sure that you have selected S3 as the Storage provider.

You are now off and running with Minio!

Google Drive

Google Drive allows external applications to create and update files within the Google platform. Currently, the platform provides form actions to create and update fields within Google Sheets, and can use Google Drive as storage for the File component. To use the Google Sheets integration, you will need an OAuth token for Google Drive and the spreadsheet ID of the Google Sheet.

To use Google Drive as storage, you may also need a folder ID (optional).

Google Drive Project

To get your OAuth token, log in with your Google credentials on the Google Developers Console. Select the project you would like to use at the top of the page. If you do not have a project, you can create a new one by clicking New Project and filling out the required fields.

Once you have created a new project, you will now click on Library and then select the Google Drive API and Google Sheets API.

Now, click the Enable button for the Google Drive API and Google Sheets API to activate them.

Once the APIs are enabled, click on the Credentials section in the sidebar. Next, click on the Create Credentials drop-down button and select OAuth client ID.

Now, provide the Web Application information with the Redirect URI set to the OAuth 2.0 Playground redirect URI, and click Create.

Save your Client ID and Client Secret.

Refresh Token

To create a Refresh Token, go to the Google OAuth 2.0 Playground and sign in using your Google credentials.

Click the gear icon in the upper right corner and check the box labeled Use your own OAuth credentials if it isn’t already checked. Then make sure that:

OAuth flow is set to Server-side.

Access type is set to Offline (this ensures you get a refresh token and an access token, instead of just an access token).

For the OAuth Client ID, enter the Client ID obtained above.

For the OAuth Client secret, enter the Client secret obtained above.

Under Step 1 on the left-hand side of the page, expand the Google Sheets API v4, select the scopes you need, and then click Authorize APIs.

The above step will generate an Authorization Code for you. Be sure that Auto-refresh the token before it expires is checked, and then click the Exchange authorization code for tokens button.

Take note of your Access Token and Refresh token.
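Once you have these credentials, an application can exchange the refresh token for a fresh access token at Google's OAuth 2.0 token endpoint. The following is a minimal sketch of the request body an application would send (parameter names follow Google's OAuth 2.0 documentation; the actual HTTP call is left out):

```python
# Google's OAuth 2.0 token endpoint
TOKEN_ENDPOINT = "https://oauth2.googleapis.com/token"

def build_refresh_request(client_id, client_secret, refresh_token):
    """Form-encoded body for exchanging a refresh token for an access token."""
    return {
        "grant_type": "refresh_token",
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
    }

# e.g. requests.post(TOKEN_ENDPOINT, data=build_refresh_request(...))
body = build_refresh_request("your-client-id", "your-client-secret", "your-refresh-token")
```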

If you have already set up Google Sheets without the Google Drive scope, just redo the guide above starting from the Library step.

Settings

Go to your project settings and open the Integrations -> Data Connections -> Google Drive tab. Enter the Client ID and Client Secret Key from the first step above. Then enter the Refresh Token from the second step. Save the settings. This gives your application access to your files in Google Drive.

Enable Google Drive storage

Go to your project settings and open the Integrations -> File Storage -> Google Drive tab. Click on the Enable Google Drive button.

File component with Google Drive as storage

To set up the File component to store files in Google Drive, add a File component, and on the File tab select Google Drive.

To get a Folder ID, go to the folder in your Google Drive and copy the ID from its URL.
