Maintenance and Migration

Upgrading Deployments

Once you have deployed the platform into your environment, you will periodically need to upgrade to the latest versions of the software. Since this does not happen automatically, you should become familiar with the process.

Note that many cloud platforms, such as AWS and Azure, provide their own methods for upgrading container deployments; where available, those methods should be used, and the documentation for them is provided by the cloud platforms themselves. For this reason, this section describes how to upgrade a local deployment of the platform.

Before upgrading, note that the platform is a "stateless" system: the containers do not maintain any files, sessions, or other data that would be lost between upgrades. Each deployment can therefore be treated as an ephemeral system that can be removed from the load balancer without affecting the stability of the overall system (provided more than one instance resides behind the load balancer). For this reason, we recommend that all non-development environments contain more than a single container instance behind a load balancer, so that updates can be performed in a "rolling" fashion: each container is removed from the load balancer, updated, and then added back once the update is complete.

Servers can end up in a bad state when multiple instances sit behind a load balancer and the software on all of them is upgraded at the same time. Perform software upgrades one instance at a time to avoid any potential issues.

Major Version Upgrades

When performing an upgrade, you should be aware of when you are performing a "major" version upgrade. This is any time you are upgrading to a version where the leftmost number in the semantic version has been incremented. For example, upgrading a 6.10.0 server to version 7.1.0 is considered a "major" upgrade, whereas upgrading from 6.10.0 to 6.11.0 is considered a "minor" version upgrade. The position of each number in the semantic version indicates the "risk" associated with the upgrade and should be taken into consideration. Here are the numbers and what they mean.

  • 6.10.2 - The 2 here is considered the "patch" version. Increments of this version are considered "patch" releases and are the least risky versions. They only include minor bug fixes against the current minor release.

  • 6.10.2 - The 10 here is considered the "minor" version. Increments of this version are considered "minor" releases, which typically include new minor versions of the renderer and may introduce new features.

  • 6.10.2 - The 6 here is considered the "major" version. Increments of this number are considered "major" releases. They occur rarely (usually once per year) and include major refactoring and improvements to the platform. There may be some reverse-compatibility breaks in these versions, so migrations to new major versions must be done with great care.
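As a quick sketch of the rule above, only the leftmost number matters when detecting a major upgrade. The helper name and version strings below are examples, not part of any platform tooling:

```shell
# Classify an upgrade by comparing the leftmost (major) numbers of two
# semantic versions. The version values used below are examples only.
classify_upgrade() {
  old_major=${1%%.*}   # strip everything after the first dot
  new_major=${2%%.*}
  if [ "$new_major" -gt "$old_major" ]; then
    echo "major"
  else
    echo "minor or patch"
  fi
}

classify_upgrade 6.10.0 7.1.0    # prints "major"
classify_upgrade 6.10.0 6.11.0   # prints "minor or patch"
```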

When upgrading major versions where there is more than one integer of difference between them (such as upgrading from 5.0.0 to 7.0.0), you must perform two separate migrations: first from 5.0.0 to the latest 6.x version (such as 6.11.1), and then from that 6.x version to the latest 7.x version (such as 7.1.0). Never migrate across two major versions in a single upgrade process.
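The one-major-version-at-a-time rule can be sketched as a small helper that lists the required hops; the function name is our own, not part of any tooling:

```shell
# List the required migration hops between two major versions,
# one major version at a time (helper name is hypothetical).
migration_hops() {
  cur=$1
  target=$2
  while [ "$cur" -lt "$target" ]; do
    cur=$((cur + 1))
    echo "upgrade to the latest ${cur}.x release"
  done
}

migration_hops 5 7
# upgrade to the latest 6.x release
# upgrade to the latest 7.x release
```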

Adherence to Reverse Compatibility

For every release that we issue, it is our highest priority to maintain reverse compatibility, even with major version releases. It is VERY rare that we will issue any release where reverse compatibility is broken, especially within the Form JSON schema that is used to render forms. Because of this, it is generally safe to upgrade your software without concern that your forms will "break". If any refactoring does occur, it will usually be paired with an update hook that automatically updates the database to the correct schema version so that the behavior of the server is maintained. Please see Resolving Update Failures for more explanation of the update schema process.

Database Backup

Before you begin upgrading your server, it is important to create a Database backup of your deployment so that you can restore from the backup if anything should go wrong. It is rare that this would occur, but this is a good measure to ensure that you have minimal downtime in the event that a problem should arise during the upgrade process.

The platform only depends on MongoDB for any "state" that is stored within the server, so as long as you have backed up the database, the upgrade can continue. There are two ways to perform a backup.

  • mongodump - This should be used if you are planning on keeping the same MongoDB database between upgrades. This maintains all indexes and internal metadata about your database.

  • mongoexport - This should be used if you plan on switching databases during the export process. This is more of a JSON export that will store all your records within JSON files that can then be re-imported into a new database. This is useful for importing the database into a new database where new indexes will be created.

Please refer to the MongoDB documentation for both of these options. For most upgrade processes, we recommend that mongodump always be used; only if situations arise in which it cannot be used should mongoexport serve as a backup plan. In either case, it does not hurt to perform both operations before an upgrade.
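As a sketch, the two backup styles look like the following. The connection string and output paths are assumptions for a local deployment and should be adjusted to match your environment:

```shell
# Full binary dump (keeps indexes and internal metadata) into one archive.
mongodump --uri="mongodb://mongo:27017/formio" --archive=/backups/formio.archive

# JSON export of a single collection (repeat per collection); useful when
# moving to a new database where indexes will be rebuilt.
mongoexport --uri="mongodb://mongo:27017/formio" --collection=submissions --out=/backups/submissions.json
```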

Upgrading Enterprise Server

Once your database has been backed up, you can now update your API server.

To perform an update with a Docker container system, you stop the currently running container, remove it, and then re-register the new version using all of the same environment variables that were used when it was originally deployed. The following commands illustrate how this can be done on a per-instance basis to perform a manual upgrade within a single environment.

If you do not know the values that were used when originally deploying the container, you can use the following command to determine the values of these environment variables.

docker inspect formio-server
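If you only need the environment variables rather than the full inspect output, the inspect format flag can narrow the result:

```shell
# Print only the environment variables of the running container.
docker inspect --format '{{json .Config.Env}}' formio-server
```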

Once you have the values of these environment variables, upgrading is easily achieved with the following command.

docker pull formio/formio-enterprise && \
docker rm formio-server-old || true && \
docker stop formio-server && \
docker rename formio-server formio-server-old && \
docker run -d \
  -e "MONGO=mongodb://mongo:27017/formio" \
  -e "PORTAL_ENABLED=true" \
  -e "" \
  -e "PDF_SERVER=http://pdf-server:4005" \
  --restart unless-stopped \
  --network formio \
  --link pdf-server:pdf-server \
  --name formio-server \
  -p 3000:80 \
  formio/formio-enterprise

After you run this command, you will then want to inspect the logs by typing the following.

docker logs formio-server

For updates that require a database update, you will see messages in the logs as each update is applied. You will need to ensure that the updates run smoothly and all complete. If any update fails, then you will need to follow the instructions below to ensure you are able to get through the updates.

Resolving Update Failures

If any of the updates fail to execute, there is a series of steps that can be taken to help resolve the problem. To understand how to resolve update failures, it is important to understand how our DB schemas work and how to get the server back up and running.

When an update is started, the first thing our server does is compare the code schema version with the version that is present in the database. The code schema version can be found by looking at the "schema" property of the package.json file within your deployment codebase. This indicates the "db schema" of the codebase. This value is compared with the value found within the database by running the following command.

db.schema.find({key:'formio'}).pretty();

Running this command within your database connection provides a result like the following.

> db.schema.find({key:'formio'}).pretty();
{
	"_id" : ObjectId("55cd5c1f2c4aaf01001fe799"),
	"key" : "formio",
	"isLocked" : false,
	"version" : "3.3.9"
}

This tells us that our database is currently on the 3.3.9 schema version and is not locked. If the "code" schema version is larger than this number, an update needs to be run.
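The comparison the server performs can be approximated in the shell with a version-aware sort; the version values here are examples only:

```shell
# Decide whether an update is needed by comparing the code schema
# version against the database schema version (example values).
code_schema="3.3.10"
db_schema="3.3.9"

# sort -V orders version strings numerically per dot-separated field,
# so 3.3.10 correctly sorts after 3.3.9.
newest=$(printf '%s\n%s\n' "$code_schema" "$db_schema" | sort -V | tail -n 1)

if [ "$newest" = "$code_schema" ] && [ "$code_schema" != "$db_schema" ]; then
  echo "update needed"
else
  echo "schema up to date"
fi
```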

If an update fails, then you will see that the "isLocked" property is set to a timestamp, and that the "version" is set to the last successful update.

If the error "DB is already locked for updating" is seen in a failed upgrade, the following steps will resolve the issue.

In the event of a problem, the first thing to do is reset the "isLocked" flag to "null" so that the update can be retried. Within your database connection, this can be done with the following command.

db.schema.updateOne({key: 'formio'}, {$set: {isLocked: null}});
After we have done this, we will then re-try our update by restarting our docker container

docker restart formio-server

Once it has restarted, inspect the logs to see if it has moved past the "stuck" update. If it has not, then you can bump the version of the update by one patch version and then retry. For example, if we were stuck on update 3.3.7, we would run the following.

db.schema.updateOne({key: 'formio'}, {$set: {version: '3.3.7'}});
You will then need to restart the server as follows.

docker restart formio-server

In most cases, this will not have any ill effects, but if you do run into this scenario, please reach out to Support so that we can provide you with the "manual" update script you had to skip, ensuring that all updates are applied cleanly to your deployment.

Upgrading PDF Server

Upgrading a PDF Server can be done by using the following command.

docker pull formio/pdf-server && \
docker rm pdf-server-old || true && \
docker stop pdf-server && \
docker rename pdf-server pdf-server-old && \
docker run -itd \
  -e "MONGO=mongodb://mongo:27017/formio" \
  -e "FORMIO_S3_SERVER=minio" \
  -e "FORMIO_S3_PORT=9000" \
  -e "FORMIO_S3_BUCKET=formio" \
  --network formio \
  --link formio-mongo:mongo \
  --link formio-minio:minio \
  --restart unless-stopped \
  --name pdf-server \
  -p 4005:4005 \
  formio/pdf-server

The same process will need to be followed as described in the API Server upgrade to ensure that the update was processed cleanly. If not, then you can follow the same steps provided in the Resolving Update Failures section to ensure that your deployment is running smoothly.

Database Transfer Between Servers

There might be occasions when it becomes necessary to migrate a database from one server or environment to another. Follow the guide below on how to clone your database to different servers.

  1. In the destination Mongo database, determine the "mount" folder by running the following command. docker inspect <MONGO_CONTAINER_NAME>

  2. Find the "Mounts" for the database. They should look like the following:

  {
    "Type": "bind",
    "Source": "/Users/travistidwell/data/db",
    "Destination": "/data/db",
    "Mode": "",
    "RW": true,
    "Propagation": "rprivate"
  }

  3. Take note of the "Source" and "Destination" folders.

  4. SSH into the source MongoDB container using the following command. docker exec -it <MONGO_CONTAINER_NAME> /bin/bash

  5. Perform a mongodump of the database and place it in the "Destination" folder above. This assumes the database name is "formio". mongodump --archive=/data/db/backup.archive --db=formio

  6. Exit the container (by typing "exit" and pressing Enter) and verify that the backup.archive file now appears in the "Source" folder.

  7. Next, "scp" this file from the source machine to the destination machine by inputting the following: scp source:~/data/db/backup.archive destination:~/data/db

  8. SSH into the destination machine and ensure the Docker containers have "stopped".

  9. Ensure that the environment variables for these containers are the same as the "source" container environment variables.

  10. Within the destination machine, find the "Source" and "Destination" folders of the docker mongo container, as in steps 1 and 2.

  11. Move the archive file into the "Source" folder of the destination Mongo container like so. cp ~/data/db/backup.archive ~/opt/mongodb

  12. Bash into your destination container: docker exec -it <CONTAINER_NAME> /bin/bash

  13. Finally, restore the backup to the destination database. mongorestore --archive=<DESTINATION_FOLDER>/backup.archive --db=formio
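The steps above can be condensed into the following sketch. The container names, paths, and hostnames are placeholders taken from the example and must be adjusted for your environment:

```shell
# On the source machine: dump the formio database inside the container.
docker exec -it mongo-source mongodump --archive=/data/db/backup.archive --db=formio

# Copy the archive from the source host to the destination host.
scp source:~/data/db/backup.archive destination:~/data/db

# On the destination machine: restore the archive inside the container.
docker exec -it mongo-dest mongorestore --archive=/data/db/backup.archive --db=formio
```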

Migrating Projects

The following documentation describes how to perform different kinds of migrations within the platform. Throughout this documentation, we will refer to both Source and Destination projects, where the Source is the project you would like to migrate FROM and the Destination is the project you would like to migrate INTO.

Migrating from one project to another can easily be achieved with a combination of the Staging system as well as the CLI tool. The Staging system is used to migrate the Forms, Resources, and all Project level configurations. It is used to migrate everything EXCEPT Submissions and Settings.

Migrating Forms, Resources, Actions, and Roles

To start, we will first export the project of the Source Project using the staging interface. This can be found by clicking on the Source Project in the Developer portal, and then clicking on Settings | Stage Versions.

For a complete migration, we will export the whole template, so just click on the Export Template button. This will download a JSON file onto your local machine, which we will use to migrate to our new project.

It is also possible to use the Export Template system to migrate only individual Forms and Resources into a destination project by unchecking the Include All checkbox and then selecting only the Forms and Resources you wish to migrate.

Now that we have our export JSON file on our local machine, we will now either create a new project with this JSON template, or we can also update an existing project with this template.

To create a new Destination project, simply click on the Create Project button in the home page, and then under the Template Settings, we will select our template JSON file.

To update an existing Destination project, you will just click on the Destination project, then go to Settings | Stage Versions (like above), and then click on the Import Template section. Once you are here, you will then click Choose File and then select the template exported from the Source project. Then, you will click the Import Project Template button to complete the import.

This will clone the Forms, Resources, Roles, and Actions into this Destination Project. Now we are ready to migrate the submissions using the CLI tool.

Migrating Submissions

To migrate submissions, we will now use the CLI tool. We can download this tool to our local machines using the following command.

npm install -g formio-cli

With the CLI tool now on our local machines, we will now need to make sure that we have an API key configured for both the Source and Destination Projects. For both of these projects, we will navigate to the Settings | API Keys section and create an API key.

Once you have an API key for both the Source and Destination projects, you can use the following command from your local computer terminal.

formio migrate \
    [SOURCE_PROJECT_URL] \
    project \
    [DESTINATION_PROJECT_URL] \
    --src-key [SOURCE_API_KEY] \
    --dst-key [DESTINATION_API_KEY]

This command will migrate submission data from a hosted project to a deployed project within your own environment.

This will now copy all submissions from the Source project into the Destination project.

Migrating Project Settings (optional)

If you wish to migrate the project settings, you will need to use the Project APIs. The APIs that will be used and the steps to follow are as follows.

  1. GET source project using Project GET API

  2. GET destination project using Project GET API

  3. Copy the Destination project API Keys and save for later

  4. Copy the Source project settings, and set them as the Destination settings.

  5. Copy the Destination project keys back into the Destination settings

  6. Perform a PUT Request to save the settings into the Destination settings.

The following shell command performs all of the steps above, and can be used to migrate project settings from one project into another while maintaining the same API Keys for future migrations.

SOURCE_URL='' && \
DEST_URL='' && \
SOURCE_PROJECT=$(curl --location \
    --request GET $SOURCE_URL \
    --header "x-token: $SOURCE_APIKEY"\
) && \
DEST_PROJECT=$(curl --location \
    --request GET $DEST_URL \
    --header "x-token: $DEST_APIKEY"\
) && \
UPDATE=$(node -e "\
    const dst=$DEST_PROJECT; \
    const src=$SOURCE_PROJECT; \
    const keys=dst.settings.keys; \
    dst.settings=src.settings; \
    dst.settings.keys=keys; \
    console.log(JSON.stringify(dst));" \
) && \
curl --location \
    --request PUT $DEST_URL \
    --header "x-token: $DEST_APIKEY" \
    --header 'Content-Type: application/json' \
    --data-raw "$UPDATE"
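The script above references four variables: the two API keys are never defined in it, and the two URL values are left empty. A hypothetical setup, with placeholder values only, might look like:

```shell
# Placeholder values; replace each with your real project URLs and API keys.
export SOURCE_URL='https://example.com/source-project'
export DEST_URL='https://example.com/destination-project'
export SOURCE_APIKEY='source-api-key'
export DEST_APIKEY='destination-api-key'
```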

Enabling the Developer Portal on Existing environment

The platform allows you to use the Hosted portal to connect to your remote environments through the On-Premise Environments section within the project. This is useful for allowing your remote environment to serve as an API-only interface for your applications while managing that deployment through a hosted portal interface. In some cases, though, you may wish to enable the developer portal within a remote environment by introducing the following environment variables (the admin values shown are placeholders).

PORTAL_ENABLED=true
ADMIN_EMAIL=admin@example.com
ADMIN_PASS=CHANGEME

It is also no longer necessary to use the PORTAL_SECRET environment variable, so you may remove it, since you will no longer be connecting to this environment with a remote portal.

Once this is done, you will need to restart the server so that the initialization process will install the Portal Base project as well as create the initial admin account for this project.

The Portal Base project is a special project that is used to control the portal application. Any users that can log into the portal are added to the User resource within this project, and anyone with the Authenticated role within this project will have the ability to log in and create new projects.

After the portal has been enabled, you can log into the portal by navigating to the root URL of the deployed API. Once you log in, you will probably notice that you do not see any of your existing projects. Do not worry: they are still there, but the "owner" of these projects needs to be established so that they show up when you log in as the root user account. To do this, first connect to your MongoDB database and then run the following command.

var account = db.submissions.find({'': ''}).next();
db.projects.updateMany({}, {$set:{owner:account._id}});

This command will now set all of the projects to have the "owner" of the account that was created within the Portal Base project. Now, when you log into your server, you will see all of your existing projects within that environment show up so that they can be managed accordingly.

Community to Enterprise Migrations

If you wish to deploy all of your forms and resources from the Community Edition into the Hosted platform, you can do this by using the CLI command line tool.

npm install -g formio-cli

Once you have this tool installed, you will need to follow these steps.

  • Create a new project within the Hosted platform

  • Create an API key within this project by going to the Project Settings | Stage Settings | API Keys

  • Next, you can execute the following command to deploy your local project into the Hosted platform

formio deploy http://localhost:3001 https://{PROJECTNAME} --dst-key={APIKEY}

You will need to make sure you replace {PROJECTNAME} and {APIKEY} with your new Hosted project name (found in the API url), as well as the API key that was created in the second step above.

This will then ask you to log into the local server (with credentials that can be found within the Admin resource), and after it authenticates, it will export the project and deploy it to the hosted project.

Next, all submissions can be migrated by following the Migrating Submissions documentation.

API & PDF Server Migrations

Be sure to review the Update Guide before migrating major versions of your API and PDF servers. Before commencing the migration process, it is recommended to create a complete backup of your existing environment(s).

Please visit the Enterprise Change Log for a full list of changes and fixes.

API Server 8.x to 9.x

While there are several major changes with the 9.0.0 release, one of the primary goals for this release is to maintain an easy upgrade path from 8.x versions. There are several important points that have been implemented with the 9.0.0 release to ensure that the migration from 8.0.0 is a quick and easy transition.

  • No database upgrade scripts or schema changes. With the 9.0.0 release, there are no schema changes or upgrade scripts that will be performed on deployments during an upgrade from 8.x to 9.0.0.

  • No major Developer Portal, Formio.js Renderer, or Form Builder changes. We encourage upgrading to 9.0.0 as soon as possible to take advantage of the security enhancements and CVE resolutions. To enable an easy upgrade path, the first 9.0.0 version contains only the necessary upgrades to dependencies and libraries, without any major changes to the Developer Portal application, Formio.js Renderer, or Form Builder since 8.x.

9.x Changes

Most of the changes for 9.0.0 pertain to security updates, performance improvements, major library dependency and runtime upgrades. The following is a detailed list of all major changes that have been made for the 9.0.0 release:

  • VM2 replaced with Isolated-VM. One of the instigating motivations for releasing a new major version was the recent deprecation of the heavily depended-upon VM2 library. Before 9.0.0, this library was used extensively to ensure proper sandboxed execution of any server-side JavaScript evaluations occurring within a number of Form features. The following server-side evaluations were previously executed within the VM2 runtime:

    • Form Component: Calculated Values w/ “Calculate on Server” enabled

    • Form Component: Custom Default Values

    • Form Component: Advanced Logic w/ Custom triggers or actions

    • Form Component: Custom Conditionals

    • Form Component: Custom Validations

    • Form Component: Select Available Items Validation

    • Form Actions: Email Action template rendering

    • Form Actions: Save Submission Transform

    • Form Actions: Custom Action Conditions

    • Form Actions: Webhook Action Transforms

    • Project Settings: Token Parse

    Each of these systems relies on an evaluation context that can securely execute JavaScript within a sandboxed environment. Because VM2 was deprecated, Isolated-VM was selected to replace it.

    This library replacement also required a refactor of the server-side data processing system. Previously, the JavaScript renderer, Formio.js, was leveraged as the mechanism to perform this validation within VM2, but this was no longer viable given the level of protection surrounding evaluation contexts within Isolated-VM.

    Therefore, a new Submission Data Processing system was developed and released under the @formio/core library. The code behind this new system is open source.

The following methods are either available or "no-op" in 9.x. This applies only to the server-side evaluation context.

component instance methods available:

  • get root

  • get component

  • get currentForm

  • get data

  • get parent

  • get dataValue (getter)

  • set dataValue (setter)

  • getValue()

  • setValue()

  • isEmpty()

component instance no-op methods:

  • get schema

  • get options

  • on()

  • off()

  • render()

  • redraw()

  • ready()

  • init()

  • destroy()

  • teardown()

  • attach()

  • detach()

  • build()

  • t(text)

  • sanitize(dirty)

  • renderString(template)

instance.root methods available:

  • getComponent(path)

  • get submission

instance.root no-ops:

  • set submission

  • set form

  • get root

instance.root fields available:

  • data (submission data)

Other updates in 9.0.0

  • New Server Validation Runtime. Along with the new data processing system comes a new validation runtime for every submission that is processed on the server. This system has been refactored to no longer use the full "formiojs" renderer on the server, but instead use a more dedicated data processing system provided by our core validation engine. This change will improve performance as well as memory allocation when new submissions are sent to the server.

  • Upgrade to Node v20. As part of the upgrade of dependencies and data processing, we are also moving to the Node v20 runtime within the Docker containers that run our Enterprise deployments. Node v20 includes several performance and security improvements, which are described in its release notes.

While it is absolutely our intention that reverse compatibility is upheld during this transition, it will be critical to test submitting forms to ensure the validation processing works as expected before deploying 9.0.0 into a production environment.

9.x Breaking Changes

There are a few breaking changes with the 9.x upgrade you should be aware of.

  • Buttons are now included within the submission object. For example:

        {
            "textField": "Test",
            "submit": true
        }
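If a downstream consumer of submission data cannot tolerate the new button keys, they can be stripped before processing. This sketch invokes Python from the shell; the field names are taken from the example above:

```shell
# Remove the "submit" button value from a submission's data payload.
echo '{"textField": "Test", "submit": true}' | python3 -c '
import json, sys
data = json.load(sys.stdin)
data.pop("submit", None)   # drop the button key if present
print(json.dumps(data))
'
```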

API Server 7.x to 8.x

Migrate your 7.x API Server to the latest 8.x API Server with ease by following our update guide.

8.x Breaking Changes

There are a few breaking changes with this upgrade you should be aware of:

Developer Portal Users

Admin accounts work as they do in 7.x where Admins can Create new Projects and Teams.

There is an automatic update hook upon upgrade to API 8.0.0 that affects the original user created with the environment variables ADMIN_EMAIL and ADMIN_PASS. The update hook migrates this original user from the User Resource to the Admin Resource in the Portal Base Project.

Any external scripts that assume the User resource to contain this user would need to be updated to point to the Admin resource.

Webhook Improvements

Payload Transformations

Modify the data payload being sent by the Webhook before it is sent using JavaScript. This allows you to alter the payload to map to any external service interfaces.

Proper Error Handling and Logging

The webhook action now features improved handling of errors and logging to ensure that there is visibility into how the webhook is operating when pointed to the 3rd party services.

Action Deprecation: Because of the Webhook improvements, it is now possible to integrate with a wide variety of 3rd-party services using nothing more than the Webhook action, making several existing integration actions redundant. With this in mind, the following actions are being deprecated:

Deprecated actions have been removed from the Portal UI but will still function on forms where the action was previously configured

Actions: Office 365, Jira, Twilio, Hubspot

These actions will be removed in favor of the Webhook Action

Action: Reset Password

Customers are encouraged to utilize our Reset Password workflow solution.

Dropbox integration

Our webhooks allow customers the flexibility to use whichever services they wish to integrate

Signature Relocation

With the addition of Box Sign, the signature component has been relocated to the Premium components tab.

8.x Features

There are also many new features included in this release designed to provide users with reliability, platform extensibility, and (more) ease of use. Please check out these new capabilities below:

Portal Admin Permissions

Portal Admins now have the ability to see and access all projects within the Developer Portal, regardless of who created them. This will also show all Environment Stages for that deployment even if the parent project does not reside within that environment.

Previously, anyone who had access to the Developer Portal had the ability to create their own projects. Additionally, these Projects could only be seen by the individuals who created them, and could not be seen by anyone else, including the Admin(s) of the Developer Portal. This created confusion and problems with controlling the number of active projects under the license.

To address this, we have split access to the Developer Portal into two separate roles, Admins and Users, and provided them with different project creation permissions and capabilities.

User accounts can now be configured to remove their ability to Create Projects and Teams via the environment variable ONLY_PRIMARY_WRITE_ACCESS.

Form Building and Form Management:

Google Drive Integration

Use Google Drive as storage for the file component.

Google Sheet action to create and update fields within your own Google Sheets.

Migration of Form Revisions

Form Revisions now migrate to destination stages during Stage Deployments. All references to revisions within nested form definitions are updated to include the correct revisions that are being transferred.

Two Factor Authentication

2FA can now be enabled for logging into the Developer portal as well as on a per-project basis to enable 2FA within your application.

Populate Resources and Forms from CSV now enables an easy-to-use CSV Upload Feature within the Developer Portal to pre-populate Submissions against any Form or Resource with a single click of a button!

New Conditional Show & Hide

As of server release 8.1.0, a code-free Conditional Show/Hide UI using simple dropdown fields is available. This update removes the need to write custom JavaScript when the Show/Hide workflow depends on multiple fields or values.

Security & Compliance Module

Submission Revisions, Custom Collections, Localization of the Developer Portal, and more!

If you are using any features that are within the Security Module, please ensure you have this module as part of your subscription in order to use these features. Please send us an email if you have any questions or to have this module added to your subscription.

Internal Process Improvements: we have automated over 90% of our testing suite to increase our agility and performance for our customers.

API Server 6.x to 7.x

Migrating your deployment from a 6.x version into a 7.x version is not complicated, but there are a few things that you should be aware of to ensure that your transition goes smoothly. The 7.x version of the Enterprise Server introduces a number of new features that you will certainly want to take advantage of, which are as follows.

7.x Features



New Licensing System

This allows you to manage all of your licenses with the platform in one single location. It also streamlines how licenses are applied to both the Enterprise Server as well as the PDF Server. In addition to this, both PDF and Enterprise Servers now manage their licenses and configure their licenses in the same way. For more information, see the License Management section of our help docs.

Group Permission Levels

With the 7.x Server release, you can now configure the Group permissions to be categorized into different levels. For example, you can now configure users to be Admins of a group, or members of a group and then assign permissions separately based on their role within the group. For more information, please see the Group Roles section in our user guide.

User Session Management

The 7.x Server release adds additional levels of security around user session management and ensures that any outstanding JWT token can no longer be utilized once a user has logged out of their current "session". This ensures that each JWT token can only be associated with a "current" session and JWT tokens associated with invalid sessions (through logout) can no longer be used. For more information, see our User Session Management user guide.

Audit Logging

The audit logging system adds additional logging capabilities that record every action taken by every user within the system. See Audit Logging for more information.

Simplified PDF Setup

In addition to the 7.x Enterprise Server, this release also includes the 3.x Enterprise PDF Server, which introduces a number of improvements to the management and configuration of how the PDF Server is connected to the API Server.

Isomorphic Validation

The new isomorphic validation uses the same core renderer as the mechanism for validating submissions within the server logic. This ensures that any validation that occurs on the front-end form is exactly the same validation that occurs within the server validation system.
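The "isomorphic" idea can be sketched as a single validation module that runs unchanged in both the browser and the server. The validator below is purely illustrative; the real renderer's API and rule set are far richer:

```javascript
// Sketch of isomorphic validation: the same function runs on the client
// (before submit) and on the server (on the incoming payload), so both
// sides apply identical rules. Illustrative only.
function validateSubmission(form, submission) {
  const errors = [];
  for (const field of form.fields) {
    const value = submission[field.key];
    if (field.required && (value === undefined || value === "")) {
      errors.push(`${field.key} is required`);
    }
  }
  return errors;
}

const form = { fields: [{ key: "email", required: true }] };
// Client-side: validate before submitting.
console.log(validateSubmission(form, {}));               // [ 'email is required' ]
// Server-side: re-run the exact same function on the payload.
console.log(validateSubmission(form, { email: "a@b" })); // []
```

Because one module is shared, the client and server can never disagree about what constitutes a valid submission.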

Before you begin your migration, it is recommended that you first create complete backups of your existing environment. Once this is done, spin up a replica environment and point it to the SAME database as your current 6.x environment. This should not cause any problems, because the 7.x upgrade does not introduce any database schema changes and does not perform any update hooks (as of version 7.0.0). Before launching this environment, however, you will need to change the following environment variables for the cloned deployment.

Enterprise Server Environment Variable Changes

6.x variable

7.x variable




Change name. Same value.


Delete this environment variable


Set value as new license key



Change name. Same value.

PDF Server Environment Variable Changes

2.x variable

3.x variable



Delete this environment variable


Delete this environment variable


Delete this environment variable


Delete this environment variable


Add the same value as provided to the Enterprise Server.


Set the value to the new license key (same as the API Server).

For the 3.x PDF Server, you will also need to ensure that the container has access to the same database as the API Server. In some cases, this may require you to add a --link to the mongo container. Please read the PDF Server Deployment user guide for detailed instructions on how to accomplish this.

After both environments are up and running, you can perform the necessary tests against the replica environment. Once you have determined that all features work as expected, you can switch DNS over to the replica environment, making it the new production, while keeping the old 6.x environment running as a failsafe (if anything should happen, you can switch DNS back to the old environment).

Fixing Deployed (remote) Projects

In some cases, you may have been using the Hosted Portal to connect to your remote environment running 6.x. Once you upgrade your deployments to version 7.0.0 or greater, note that you cannot use any portal version less than 7.1.0. The portal version can be found in the footer of the hosted portal. If the hosted portal is not yet on that version, one option is to enable the deployed portal within your environment. This can be accomplished by following the Enable the Developer Portal user guide.

One other thing to note: in some cases, the License will erroneously register the "stage" projects within your deployment as "licensed" projects. This occurs because the projects within your deployed environment do not contain the correct "project" property associating them with the licensed project. To fix this, do the following.

  • Determine the Project ID of the "main" project that is licensed. This can be found within the License Manager, or by clicking on the main project in your hosted portal and taking note of the Project ID within the URL of the browser.

  • Once you have the Project ID, connect your terminal to your MongoDB instance and run the following command to set the correct "project" property on your remote projects.

db.projects.updateMany({}, {$set:{project:ObjectId('LICENSED_PROJECT_ID')}});

Replace LICENSED_PROJECT_ID with the ID of your "main" project. Once you do this, any projects within your new remote environment will register as "stages" within the main project and will no longer erroneously count against your license.
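To make the effect of that command concrete, here is a small in-memory simulation (plain Node, with invented sample data; the ObjectId is shown as a plain string) of what the updateMany does to the projects collection:

```javascript
// Simulates the effect of the updateMany above on an in-memory projects
// collection. All data here is illustrative.
const LICENSED_PROJECT_ID = "64a1f0c2e4b0aa0012345678"; // your "main" project ID

const projects = [
  { _id: "p1", title: "Staging" },
  { _id: "p2", title: "Production" },
];

// Equivalent of { $set: { project: ObjectId(LICENSED_PROJECT_ID) } }
// applied to every document (the empty {} filter matches all).
for (const doc of projects) {
  doc.project = LICENSED_PROJECT_ID;
}

console.log(projects.every((p) => p.project === LICENSED_PROJECT_ID)); // true
```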

API Server 5.x to 6.x

The migration from the 5.x server to the 6.x server should be a very seamless process. Simply follow the instructions provided in the Upgrading Deployments section. In some cases, you may run into DB update failures; if so, follow the instructions provided in the Resolving Update Failures section to mitigate these issues.

PDF Server 1.x-3.x to 5.x

The 5.x PDF Server requires an upgrade to API Server 8.0.0 or higher.

If you are using the PDF Server APIs directly, you will need to change the way your application authorizes requests:

  • If you are making client-to-server requests (your application runs in the browser), send requests to the API Server's /pdf-proxy endpoint using the x-jwt-token header. See the PDF API section of the API docs.

  • If you are making server-to-server requests (your application runs on the server, where request headers are not exposed to end users), you can send requests directly to the PDF Server using admin key authorization. To do this, add the FORMIO_PDF_ADMINKEY environment variable to your PDF Server run config and set it to a secret key. You can then authorize your requests using the x-admin-key header, whose value must match the environment variable. See the PDF Server direct API section of the API docs.
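The two authorization styles above can be sketched with a hypothetical helper. The endpoint path /pdf-proxy and the x-jwt-token and x-admin-key headers follow the docs; the helper name, the /pdf path on the PDF Server, and the URLs are illustrative assumptions:

```javascript
// Sketch of building the two kinds of PDF requests described above.
// buildPdfRequest is a hypothetical helper, not part of any SDK.
function buildPdfRequest({ serverToServer, jwtToken, adminKey, apiServerUrl, pdfServerUrl }) {
  if (serverToServer) {
    // Direct to the PDF Server, authorized with the admin key
    // (must match the FORMIO_PDF_ADMINKEY environment variable).
    // The /pdf path here is illustrative.
    return {
      url: `${pdfServerUrl}/pdf`,
      headers: { "x-admin-key": adminKey },
    };
  }
  // Browser-to-server: go through the API Server's /pdf-proxy endpoint
  // with the user's JWT token.
  return {
    url: `${apiServerUrl}/pdf-proxy`,
    headers: { "x-jwt-token": jwtToken },
  };
}

const clientReq = buildPdfRequest({
  serverToServer: false,
  jwtToken: "eyJ...",
  apiServerUrl: "https://api.example.com",
});
console.log(clientReq.url); // https://api.example.com/pdf-proxy
```

The design point is simply that browser code never holds the admin key; only server-side code that can keep the key secret should talk to the PDF Server directly.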
