Persistent VM creation from templates does not work

Problem description:
Creating a persistent VM from a template does not work: the VM freezes in the "CLONING" state. Everything else works without any problems (non-persistent VM creation from templates, app downloads from the marketplace, templates and images, etc.).

Configuration:
OpenNebula 6.0
2 front-ends, 3 compute nodes, FC storage with shared LUNs. Filesystem: OCFS2.

The disks are mounted correctly at the expected paths on the front-end and compute nodes:
/var/lib/one//datastores/118/
/var/lib/one//datastores/116/
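A quick way to double-check this claim on every node is to confirm each path is a real mountpoint rather than a plain directory. This is only a sketch (the `is_mounted` helper is hypothetical); it reads `/proc/mounts` directly so it works on any Linux node:

```shell
# Hypothetical sanity check: verify each datastore path is an actual
# mountpoint (the OCFS2 filesystems on the shared FC LUNs).
# Run on every front-end and compute node; paths are from the report.
is_mounted() {
  # /proc/mounts lists the mountpoint in field 2
  if awk -v p="$1" '$2 == p { ok = 1 } END { exit !ok }' /proc/mounts; then
    echo "$1: mounted"
  else
    echo "$1: NOT mounted"
  fi
}

for ds in /var/lib/one/datastores/116 /var/lib/one/datastores/118; do
  is_mounted "$ds"
done
```

If a path reports NOT mounted on any node, the CLONING hang would have a much simpler explanation, so this is worth ruling out first.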

Datastores:
116 SYSTEM
Attributes
ALLOW_ORPHANS NO
DISK_TYPE FILE
DS_MIGRATE YES
RESTRICTED_DIRS /
SAFE_DIRS /var/tmp
SHARED YES
TM_MAD shared
TYPE SYSTEM_DS

118 IMAGE (main-images)
Attributes:
ALLOW_ORPHANS NO
CLONE_TARGET SYSTEM
CLONE_TARGET_SSH SYSTEM
DISK_TYPE FILE
DISK_TYPE_SSH FILE
DS_MAD fs
LN_TARGET NONE
LN_TARGET_SSH SYSTEM
RESTRICTED_DIRS /
SAFE_DIRS /var/tmp
TM_MAD shared
TM_MAD_SYSTEM ssh
TYPE IMAGE_DS
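Note that this image datastore combines TM_MAD=shared with TM_MAD_SYSTEM=ssh. A small sketch for spotting that combination from an attribute dump like the one above (the `check_tm_mads` helper is hypothetical; in a live deployment the input would come from `onedatastore show <id>`):

```shell
# Hypothetical helper: report the TM_MAD / TM_MAD_SYSTEM pair of a
# datastore, given its attributes in the "KEY VALUE" format shown above.
check_tm_mads() {
  tm_mad=$(echo "$1" | awk '$1 == "TM_MAD" { print $2 }')
  tm_mad_system=$(echo "$1" | awk '$1 == "TM_MAD_SYSTEM" { print $2 }')
  echo "TM_MAD=${tm_mad} TM_MAD_SYSTEM=${tm_mad_system:-unset}"
}

# Attribute values taken from datastore 118 in this report
attrs_118='TM_MAD shared
TM_MAD_SYSTEM ssh'

check_tm_mads "$attrs_118"   # prints: TM_MAD=shared TM_MAD_SYSTEM=ssh
```

Surfacing the pair explicitly makes it easy to compare against the system datastore (116, TM_MAD=shared) when reporting or triaging the issue.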

oned.log

Tue Apr 27 13:30:27 2021 [Z0][ReM][D]: Req:8480 UID:0 IP:127.0.0.1 one.zone.raftstatus invoked
Tue Apr 27 13:30:27 2021 [Z0][ReM][D]: Req:8480 UID:0 one.zone.raftstatus result SUCCESS, "<SERVER_ID>-1<…"
Tue Apr 27 13:30:27 2021 [Z0][ReM][D]: Req:5664 UID:0 IP:127.0.0.1 one.vmpool.infoextended invoked , -2, -1, -1, -1
Tue Apr 27 13:30:27 2021 [Z0][ReM][D]: Req:5664 UID:0 one.vmpool.infoextended result SUCCESS, "<VM_POOL>332…"
Tue Apr 27 13:30:27 2021 [Z0][ReM][D]: Req:8352 UID:0 IP:127.0.0.1 one.vmpool.infoextended invoked , -2, -1, -1, -1
Tue Apr 27 13:30:27 2021 [Z0][ReM][D]: Req:8352 UID:0 one.vmpool.infoextended result SUCCESS, "<VM_POOL>332…"
Tue Apr 27 13:30:31 2021 [Z0][DBM][I]: Purging obsolete LogDB records: 0 records purged. Log state: 0,0 - 0,0
Tue Apr 27 13:30:31 2021 [Z0][DBM][I]: Purging obsolete federated LogDB records: 0 records purged. Federated log size: 0.
Tue Apr 27 13:30:33 2021 [Z0][ReM][D]: Req:7264 UID:0 IP:127.0.0.1 one.template.info invoked , 793, false, false
Tue Apr 27 13:30:33 2021 [Z0][ReM][D]: Req:7264 UID:0 one.template.info result SUCCESS, "793<…"
Tue Apr 27 13:30:33 2021 [Z0][ReM][D]: Req:5104 UID:0 IP:127.0.0.1 one.template.instantiate invoked , 793, "debian-vm", false, "SCHED_REQUIREMENTS="…", true
Tue Apr 27 13:30:33 2021 [Z0][ImM][I]: Cloning image /var/lib/one//datastores/118/1b43f5dd3224c9724390fc7aeac310d2 to repository as image 310
Tue Apr 27 13:30:33 2021 [Z0][ReM][D]: Req:5104 UID:0 one.template.instantiate result SUCCESS, 333
Tue Apr 27 13:30:33 2021 [Z0][ReM][D]: Req:7008 UID:0 IP:127.0.0.1 one.templatepool.info invoked , -2, -1, -1
Tue Apr 27 13:30:33 2021 [Z0][ReM][D]: Req:7008 UID:0 one.templatepool.info result SUCCESS, "<VMTEMPLATE_POOL><VM…"
Tue Apr 27 13:30:33 2021 [Z0][ReM][D]: Req:8496 UID:0 IP:127.0.0.1 one.template.info invoked , 793, false, false
Tue Apr 27 13:30:33 2021 [Z0][ReM][D]: Req:8496 UID:0 one.template.info result SUCCESS, "793<…"
Tue Apr 27 13:30:33 2021 [Z0][ReM][D]: Req:3024 UID:0 IP:127.0.0.1 one.user.info invoked , 0, false
Tue Apr 27 13:30:33 2021 [Z0][ReM][D]: Req:3024 UID:0 one.user.info result SUCCESS, "0<GID…"
Tue Apr 27 13:30:40 2021 [Z0][ImM][I]: Image cloned and ready to use.
Tue Apr 27 13:30:40 2021 [Z0][InM][D]: Monitoring datastore main-images (118)
Tue Apr 27 13:30:40 2021 [Z0][ImM][D]: Datastore main-images (118) successfully monitored.
Tue Apr 27 13:30:42 2021 [Z0][ReM][D]: Req:5648 UID:0 IP:127.0.0.1 one.zone.raftstatus invoked
Tue Apr 27 13:30:42 2021 [Z0][ReM][D]: Req:5648 UID:0 one.zone.raftstatus result SUCCESS, "<SERVER_ID>-1<…"
Tue Apr 27 13:30:42 2021 [Z0][ReM][D]: Req:8016 UID:0 IP:127.0.0.1 one.vmpool.infoextended invoked , -2, -1, -1, -1
Tue Apr 27 13:30:42 2021 [Z0][ReM][D]: Req:8016 UID:0 one.vmpool.infoextended result SUCCESS, "<VM_POOL>333…"
Tue Apr 27 13:30:42 2021 [Z0][ReM][D]: Req:7696 UID:0 IP:127.0.0.1 one.vmpool.infoextended invoked , -2, -1, -1, -1
Tue Apr 27 13:30:42 2021 [Z0][ReM][D]: Req:7696 UID:0 one.vmpool.infoextended result SUCCESS, "<VM_POOL>333…"
Tue Apr 27 13:30:46 2021 [Z0][ReM][D]: Req:2944 UID:24 IP:127.0.0.1 one.datastorepool.info invoked
Tue Apr 27 13:30:46 2021 [Z0][ReM][D]: Req:2944 UID:24 one.datastorepool.info result SUCCESS, "<DATASTORE_POOL><DAT…"
Tue Apr 27 13:30:46 2021 [Z0][ReM][D]: Req:7712 UID:24 IP:127.0.0.1 one.hostpool.info invoked
Tue Apr 27 13:30:46 2021 [Z0][ReM][D]: Req:7712 UID:24 one.hostpool.info result SUCCESS, "<HOST_POOL><ID…"
Tue Apr 27 13:30:46 2021 [Z0][ReM][D]: Req:8784 UID:0 IP:127.0.0.1 one.vmpool.info invoked , -2, 0, -200, -1
Tue Apr 27 13:30:46 2021 [Z0][ReM][D]: Req:8784 UID:0 one.vmpool.info result SUCCESS, "<VM_POOL>333…"
Tue Apr 27 13:30:46 2021 [Z0][ReM][D]: Req:8096 UID:0 IP:127.0.0.1 one.grouppool.info invoked
Tue Apr 27 13:30:46 2021 [Z0][ReM][D]: Req:2560 UID:0 IP:127.0.0.1 one.imagepool.info invoked , -2, 0, -200, -1
Tue Apr 27 13:30:46 2021 [Z0][ReM][D]: Req:3648 UID:0 IP:127.0.0.1 one.vnpool.info invoked , -2, -1, -1
Tue Apr 27 13:30:46 2021 [Z0][ReM][D]: Req:8096 UID:0 one.grouppool.info result SUCCESS, "<GROUP_POOL><…"
Tue Apr 27 13:30:46 2021 [Z0][ReM][D]: Req:2560 UID:0 one.imagepool.info result SUCCESS, "<IMAGE_POOL><…"
Tue Apr 27 13:30:46 2021 [Z0][ReM][D]: Req:3648 UID:0 one.vnpool.info result SUCCESS, "<VNET_POOL><ID…"
Tue Apr 27 13:30:46 2021 [Z0][ReM][D]: Req:8400 UID:0 IP:127.0.0.1 one.clusterpool.info invoked
Tue Apr 27 13:30:46 2021 [Z0][ReM][D]: Req:8400 UID:0 one.clusterpool.info result SUCCESS, "<CLUSTER_POOL><CLUST…"
Tue Apr 27 13:30:47 2021 [Z0][ReM][D]: Req:3952 UID:0 IP:127.0.0.1 one.user.info invoked , 0, false
Tue Apr 27 13:30:47 2021 [Z0][ReM][D]: Req:3952 UID:0 one.user.info result SUCCESS, "0<GID…"
Tue Apr 27 13:30:48 2021 [Z0][ReM][D]: Req:7216 UID:24 IP:127.0.0.1 one.vnpool.info invoked , -2, -1, -1
Tue Apr 27 13:30:48 2021 [Z0][ReM][D]: Req:7216 UID:24 one.vnpool.info result SUCCESS, "<VNET_POOL><ID…"

333.log
Tue Apr 27 13:30:33 2021 [Z0][VM][I]: New state is CLONING
Tue Apr 27 13:30:40 2021 [Z0][VM][I]: New state is CLONING

We are releasing a maintenance version for 6.0.0 that may address this issue; we believe you may be impacted by it.

Upgrading to 6.0.0.1-1 fixed this issue.
Thank you.