Trouble importing BPM Proc Task JSON

I like to migrate data between environments using the JSON export and import functionality, including the BPM content. However, I am having trouble importing BPM Proc Tasks.

I attempt to export and then import entities in the following order:

  1. ProcAttachmentType
  2. ProcModel (after import I deploy each one before continuing)
  3. ProcDefinition
  4. ProcInstance
  5. ProcRole
  6. ProcActor
  7. StencilSet
  8. ProcTask
  9. ProcAttachment
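For reference, here is a hedged sketch of how I do the programmatic import for one step of the list above. EntityImportExportService, EntityImportView, and importEntitiesFromJSON are real CUBA platform APIs; the file path and the per-entity wiring are just my migration scheme.

```java
// Sketch only: step 2 of the list above (file path is an assumption).
EntityImportExportService importExportService =
        AppBeans.get(EntityImportExportService.NAME);

String json = new String(
        Files.readAllBytes(Paths.get("export/ProcModel.json")),
        StandardCharsets.UTF_8);

EntityImportView view = new EntityImportView(ProcModel.class)
        .addLocalProperties();

// Import, then deploy each model before moving on to ProcDefinition.
Collection<Entity> imported =
        importExportService.importEntitiesFromJSON(json, view);
```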

Things fail when I get to importing ProcTask with the following error:

java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
    at com.haulmont.chile.core.model.utils.MethodsCache.invokeGetter(
    at com.haulmont.chile.core.model.impl.AbstractInstance.getValue(
    at com.haulmont.cuba.core.entity.BaseGenericIdEntity.getValue(

Caused by: org.activiti.engine.ActivitiObjectNotFoundException: no deployed process definition found with id 'contractApproval:1:30'
    at org.activiti.engine.impl.persistence.deploy.DeploymentManager.findDeployedProcessDefinitionById(
    at org.activiti.engine.impl.persistence.deploy.DeploymentManager.getBpmnModelById(
    at org.activiti.engine.impl.cmd.GetBpmnModelCmd.execute(
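As an aside, the outer RuntimeException/InvocationTargetException pair in the trace is just reflection plumbing: when a getter invoked through reflection throws, the real exception is wrapped in InvocationTargetException, and the framework rethrows it. A minimal, self-contained plain-Java illustration (the Task class and the message are made up for the demo):

```java
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;

public class GetterReflectionDemo {
    // Stand-in for an entity whose getter fails, like getLocOutcome()
    // failing when the process definition is missing from the cache.
    public static class Task {
        public String getLocOutcome() {
            throw new IllegalStateException("no deployed process definition found");
        }
    }

    public static String describeFailure() {
        try {
            Method getter = Task.class.getMethod("getLocOutcome");
            getter.invoke(new Task());
            return "no failure";
        } catch (InvocationTargetException e) {
            // Reflection wraps the getter's own exception; frameworks then
            // typically rethrow it as a RuntimeException, producing a
            // stack trace shaped like the one above.
            return e.getCause().getMessage();
        } catch (ReflectiveOperationException e) {
            return e.toString();
        }
    }

    public static void main(String[] args) {
        System.out.println(describeFailure());
    }
}
```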

The strange thing is that when I look at Process Definitions in the Entity Inspector, I do see the definition with the Activiti process id of ‘contractApproval:1:30’.

If I use the Entity Inspector to import the data instead of calling EntityImportExportService programmatically, the data does make it into the database table, but it is corrupted: searching for Proc Tasks in the Entity Inspector raises the same error.

Should I be able to do what I am trying? I tried looking at the code but I am having a hard time understanding the Deployment Cache.


I don’t think it will work this way. It seems that when you import tasks using the entityManager, it tries to access @MetaProperty methods, e.g. this one:

public String getLocOutcome() {

which in turn invokes Activiti services to read the information from the model.
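An illustrative sketch of what such a getter roughly does — this is not the actual add-on source; getBpmnModel() is a real Activiti RepositoryService method (it is what GetBpmnModelCmd in the trace executes), but the surrounding names are assumptions:

```java
// Hypothetical sketch, not the real BPM add-on code.
@MetaProperty
public String getLocOutcome() {
    // Throws ActivitiObjectNotFoundException when the referenced process
    // definition was never deployed in this environment's engine:
    BpmnModel model = repositoryService.getBpmnModel(
            getProcInstance().getActProcessDefinitionId());
    // ... look up the localized outcome caption in the model ...
    return localizedOutcome;
}
```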

In addition to the database tables created by the BPM add-on, there are many tables created by the Activiti engine itself (ACT_RE_PROCDEF, ACT_RU_EXECUTION, etc.). When you deploy a process, a record is created in ACT_RE_PROCDEF with a new ID (the last part of an id like ‘contractApproval:1:30’ is generated at deployment time), but the ProcDefinition entity you are importing references a process definition id that exists only in the old system. The same applies to process instances and tasks.

The question is: what task are you actually solving with this? Why do you move data between two environments, and wouldn’t it be easier to just copy the table contents using DBMS tools?

Hi Max, thanks for the insight. I see the challenge now.

You pose a good question. Perhaps I am not experienced enough with DBMS tools. These are the advantages I perceive of JSON data loads vs. SQL:

  • It’s independent of the database. If I switch databases, I don’t have to change the code.
  • It performs upserts. I can update or insert into an existing database with the exported data.
  • I oversimplified my use case a little. I don’t just use the feature for database migration. I also use it for keeping a library of different data sets to be loaded: seed data, demo data, test data, client data, etc.
  • I can build administration screens into my application for exporting or importing data. Unlike the Entity Inspector, my screens do this in bulk: export all data, load seed data, load demo data, load test data, etc. This is not a critical requirement but it’s nice so that an administrator does not need to learn another tool.
  • I can create this logic without much coding because the entities are self-aware, so one block of code can service almost all entities. A little extra effort is needed when importing entities with many-to-many relationships, but otherwise it’s the same. With SQL, I think I would need different code for each table.
  • I can export/import “complex” entities. For example, I can export Airports and have those objects include the Terminals and the Gates. I don’t have to worry about the individual tables.
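To make the “one block of code” point concrete, here is a hedged sketch of a generic export helper. DataManager, LoadContext, View.LOCAL, and exportEntitiesToJSON are real CUBA APIs; the helper method itself is hypothetical.

```java
// Hypothetical generic helper: exports every instance of any entity type.
public String exportAllInstances(MetaClass metaClass) {
    DataManager dataManager = AppBeans.get(DataManager.class);
    LoadContext<Entity> ctx = new LoadContext<>(metaClass)
            .setQuery(LoadContext.createQuery(
                    "select e from " + metaClass.getName() + " e"))
            .setView(View.LOCAL);
    List<Entity> entities = dataManager.loadList(ctx);
    EntityImportExportService service =
            AppBeans.get(EntityImportExportService.NAME);
    return service.exportEntitiesToJSON(entities);
}
```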

Can you recommend any tools that can help facilitate these activities?

For your cases, the JSON import/export provided by the platform seems to be a good choice for working with platform entities. Just in case you missed it, you can configure how complex graphs should be imported using the EntityImportView. A short description of how to use it is in the class javadoc; also see how roles and groups are imported in and
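A hedged sketch of such an EntityImportView, reusing the Airport/Terminal/Gate example from the previous post (those entity and property names belong to that example; EntityImportView, addOneToManyProperty, and CollectionImportPolicy are real platform APIs):

```java
// Nested import views: airports bring their terminals, terminals
// bring their gates; absent collection items are removed on import.
EntityImportView gateView = new EntityImportView(Gate.class)
        .addLocalProperties();

EntityImportView terminalView = new EntityImportView(Terminal.class)
        .addLocalProperties()
        .addOneToManyProperty("gates", gateView,
                CollectionImportPolicy.REMOVE_ABSENT_ITEMS);

EntityImportView airportView = new EntityImportView(Airport.class)
        .addLocalProperties()
        .addOneToManyProperty("terminals", terminalView,
                CollectionImportPolicy.REMOVE_ABSENT_ITEMS);

Collection<Entity> imported =
        importExportService.importEntitiesFromJSON(json, airportView);
```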