Cannot copy image from one host to another

I have finished setting up my OpenNebula cloud on Ubuntu 16.04 LTS, deployed with MAAS. All my nodes can SSH into each other without issue, and I have created the needed datastores for each host. I then downloaded the Ubuntu 16.04 - KVM app from the marketplace and, as a test, tried to clone it to a datastore on another host using the Sunstone UI. However, it was unable to copy the image.

System details:

  • OpenNebula Version: 5.4.6
  • Operating System: Ubuntu 16.04.4 LTS
  • Storage Layout: SSH
  • Able to SSH Between Hosts: YES


The contents of my log regarding the failed transfer:

Fri Jun  1 18:10:32 2018 [Z0][ReM][D]: Req:3424 UID:0 one.image.info invoked , 0
Fri Jun  1 18:10:32 2018 [Z0][ReM][D]: Req:3424 UID:0 one.image.info result SUCCESS, "<IMAGE><ID>0</ID><UI..."
Fri Jun  1 18:10:32 2018 [Z0][ReM][D]: Req:2960 UID:0 one.image.clone invoked , 0, "Ubuntu 16.04 - KVM -...", 116
Fri Jun  1 18:10:32 2018 [Z0][ImM][I]: Cloning image /var/lib/one//datastores/1/4b58d3df9b1432dc7fa494605ac58bc4 to repository as image 1
Fri Jun  1 18:10:32 2018 [Z0][ReM][D]: Req:2960 UID:0 one.image.clone result SUCCESS, 1
Fri Jun  1 18:10:33 2018 [Z0][ImM][I]: Command execution fail: /var/lib/one/remotes/datastore/fs/clone PERTX0RSSVZFUl9BQ1RJT05fREFUQT48SU1BR0U+PElEPjE8L0lEPjxVSUQ+MDwvVUlEPjxHSUQ+MDwvR0lEPjxVTkFNRT5vbmVhZG1pbjwvVU5BTUU+PEdOQU1FPm9uZWFkbWluPC9HTkFNRT48TkFNRT5VYnVudHUgMTYuMDQgLSBLVk0gLSBDYXVzYWw8L05BTUU+PFBFUk1JU1NJT05TPjxPV05FUl9VPjE8L09XTkVSX1U+PE9XTkVSX00+MTwvT1dORVJfTT48T1dORVJfQT4wPC9PV05FUl9BPjxHUk9VUF9VPjA8L0dST1VQX1U+PEdST1VQX00+MDwvR1JPVVBfTT48R1JPVVBfQT4wPC9HUk9VUF9BPjxPVEhFUl9VPjA8L09USEVSX1U+PE9USEVSX00+MDwvT1RIRVJfTT48T1RIRVJfQT4wPC9PVEhFUl9BPjwvUEVSTUlTU0lPTlM+PFRZUEU+MDwvVFlQRT48RElTS19UWVBFPjA8L0RJU0tfVFlQRT48UEVSU0lTVEVOVD4wPC9QRVJTSVNURU5UPjxSRUdUSU1FPjE1Mjc4NzY2MzI8L1JFR1RJTUU+PFNPVVJDRT48IVtDREFUQVtdXT48L1NPVVJDRT48UEFUSD48IVtDREFUQVsvdmFyL2xpYi9vbmUvL2RhdGFzdG9yZXMvMS80YjU4ZDNkZjliMTQzMmRjN2ZhNDk0NjA1YWM1OGJjNF1dPjwvUEFUSD48RlNUWVBFPjwhW0NEQVRBW3Fjb3cyXV0+PC9GU1RZUEU+PFNJWkU+MjI1MjwvU0laRT48U1RBVEU+NDwvU1RBVEU+PFJVTk5JTkdfVk1TPjA8L1JVTk5JTkdfVk1TPjxDTE9OSU5HX09QUz4wPC9DTE9OSU5HX09QUz48Q0xPTklOR19JRD4wPC9DTE9OSU5HX0lEPjxUQVJHRVRfU05BUFNIT1Q+LTE8L1RBUkdFVF9TTkFQU0hPVD48REFUQVNUT1JFX0lEPjExNjwvREFUQVNUT1JFX0lEPjxEQVRBU1RPUkU+Y2FzdWFsLWltYWdlczwvREFUQVNUT1JFPjxWTVM+PC9WTVM+PENMT05FUz48L0NMT05FUz48QVBQX0NMT05FUz48L0FQUF9DTE9ORVM+PFRFTVBMQVRFPjxERVZfUFJFRklYPjwhW0NEQVRBW3ZkXV0+PC9ERVZfUFJFRklYPjxEUklWRVI+PCFbQ0RBVEFbcWNvdzJdXT48L0RSSVZFUj48Rk9STUFUPjwhW0NEQVRBW3Fjb3cyXV0+PC9GT1JNQVQ+PEZST01fQVBQPjwhW0NEQVRBWzZdXT48L0ZST01fQVBQPjxGUk9NX0FQUF9OQU1FPjwhW0NEQVRBW1VidW50dSAxNi4wNCAtIEtWTV1dPjwvRlJPTV9BUFBfTkFNRT48RlNUWVBFPjwhW0NEQVRBW3Fjb3cyXV0+PC9GU1RZUEU+PE1ENT48IVtDREFUQVs1ODgzOGQwODQ0ZGVlM2I1ZWI5MWZjOGE5NWRmMzgyM11dPjwvTUQ1PjwvVEVNUExBVEU+PFNOQVBTSE9UUz48QUxMT1dfT1JQSEFOUz48IVtDREFUQVtOT11dPjwvQUxMT1dfT1JQSEFOUz48L1NOQVBTSE9UUz48L0lNQUdFPjxEQVRBU1RPUkU+PElEPjExNjwvSUQ+PFVJRD4wPC9VSUQ+PEdJRD4wPC9HSUQ+PFVOQU1FPm9uZWFkbWluPC9VTkFNRT48R05BTUU+b25lYWRtaW48L0dOQU1FPjxOQU1FPmNhc3VhbC1pbWFnZXM8L05BTUU+PFBFUk1JU1NJT05TPjxPV05FUl9VPjE8L09XTkVSX1U+PE9XTkVSX00+MTwvT1dORVJfTT48T1dORVJfQT4wPC9PV05FUl9BPjxHUk9VUF9VPjE8L0dST1VQX1U+PEdST1VQX00+MDwvR1JPVVBfTT48R1JPVVBfQT4wPC9HUk9VUF9BPjxPVEhFUl9VPjA8L09USEVSX1U+PE9USEVSX00+MDwvT1RIRVJfTT48T1RIRVJfQT4wPC9PVEhFUl9BPjwvUEVSTUlTU0lPTlM+PERTX01BRD48IVtDREFUQVtmc11dPjwvRFNfTUFEPjxUTV9NQUQ+PCFbQ0RBVEFbc3NoXV0+PC9UTV9NQUQ+PEJBU0VfUEFUSD48IVtDREFUQVsvdmFyL2xpYi9vbmUvL2RhdGFzdG9yZXMvMTE2XV0+PC9CQVNFX1BBVEg+PFRZUEU+MDwvVFlQRT48RElTS19UWVBFPjA8L0RJU0tfVFlQRT48U1RBVEU+MDwvU1RBVEU+PENMVVNURVJTPjxJRD4wPC9JRD48L0NMVVNURVJTPjxUT1RBTF9NQj4xODc3NDEzPC9UT1RBTF9NQj48RlJFRV9NQj4xNzcyMTA0PC9GUkVFX01CPjxVU0VEX01CPjk5MjA8L1VTRURfTUI+PElNQUdFUz48L0lNQUdFUz48VEVNUExBVEU+PEFMTE9XX09SUEhBTlM+PCFbQ0RBVEFbTk9dXT48L0FMTE9XX09SUEhBTlM+PEJSSURHRV9MSVNUPjwhW0NEQVRBW2NhdXNhbC1jYXQubWNyZXNvbHV0aW9uLm9yZ11dPjwvQlJJREdFX0xJU1Q+PENMT05FX1RBUkdFVD48IVtDREFUQVtTWVNURU1dXT48L0NMT05FX1RBUkdFVD48RElTS19UWVBFPjwhW0NEQVRBW0ZJTEVdXT48L0RJU0tfVFlQRT48RFNfTUFEPjwhW0NEQVRBW2ZzXV0+PC9EU19NQUQ+PExOX1RBUkdFVD48IVtDREFUQVtTWVNURU1dXT48L0xOX1RBUkdFVD48UkVTVFJJQ1RFRF9ESVJTPjwhW0NEQVRBWy9dXT48L1JFU1RSSUNURURfRElSUz48U0FGRV9ESVJTPjwhW0NEQVRBWy92YXIvdGVtcF1dPjwvU0FGRV9ESVJTPjxUTV9NQUQ+PCFbQ0RBVEFbc3NoXV0+PC9UTV9NQUQ+PFRZUEU+PCFbQ0RBVEFbSU1BR0VfRFNdXT48L1RZUEU+PC9URU1QTEFURT48L0RBVEFTVE9SRT48L0RTX0RSSVZFUl9BQ1RJT05fREFUQT4= 1
Fri Jun  1 18:10:33 2018 [Z0][ImM][I]: clone: Copying remotely local image /var/lib/one//datastores/1/4b58d3df9b1432dc7fa494605ac58bc4 to the image repository
Fri Jun  1 18:10:33 2018 [Z0][ImM][E]: clone: Command "mkdir -p /var/lib/one//datastores/116; cp -f /var/lib/one//datastores/1/4b58d3df9b1432dc7fa494605ac58bc4 /var/lib/one//datastores/116/5683b7cf8779a2ac8ccae8d0cec17fc8" failed: cp: cannot stat '/var/lib/one//datastores/1/4b58d3df9b1432dc7fa494605ac58bc4': No such file or directory
Fri Jun  1 18:10:33 2018 [Z0][ImM][E]: Error copying /var/lib/one//datastores/1/4b58d3df9b1432dc7fa494605ac58bc4 to /var/lib/one//datastores/116/5683b7cf8779a2ac8ccae8d0cec17fc8 in causal-cat.mcresolution.org
Fri Jun  1 18:10:33 2018 [Z0][ImM][I]: ExitCode: 1
Fri Jun  1 18:10:33 2018 [Z0][ImM][E]: Error cloning from Image 0: Error copying /var/lib/one//datastores/1/4b58d3df9b1432dc7fa494605ac58bc4 to /var/lib/one//datastores/116/5683b7cf8779a2ac8ccae8d0cec17fc8 in causal-cat.mcresolution.org
Fri Jun  1 18:10:33 2018 [Z0][InM][D]: Monitoring datastore casual-images (116)
Fri Jun  1 18:10:33 2018 [Z0][ImM][D]: Datastore casual-images (116) successfully monitored.


However, if I manually go to /var/lib/one/datastores/1 and run ls, the image is indeed there:

oneadmin@picked-bass:/var/lib/one/datastores/1# cd /var/lib/one/datastores/1/
oneadmin@picked-bass:/var/lib/one/datastores/1# ls
4b58d3df9b1432dc7fa494605ac58bc4
oneadmin@picked-bass:/var/lib/one/datastores/1#


As can be seen from the following portion of the log, all the datastores and hosts are successfully communicating with oned:

Fri Jun  1 18:11:30 2018 [Z0][MKP][D]: Monitoring marketplace OpenNebula Public (0)
Fri Jun  1 18:11:30 2018 [Z0][InM][D]: Monitoring datastore images (1)
Fri Jun  1 18:11:30 2018 [Z0][InM][D]: Monitoring datastore files (2)
Fri Jun  1 18:11:30 2018 [Z0][InM][D]: Monitoring datastore exotic-files (107)
Fri Jun  1 18:11:30 2018 [Z0][InM][D]: Monitoring datastore exotic-images (108)
Fri Jun  1 18:11:30 2018 [Z0][InM][D]: Monitoring datastore crisp-images (110)
Fri Jun  1 18:11:30 2018 [Z0][InM][D]: Monitoring datastore casual-images (116)
Fri Jun  1 18:11:30 2018 [Z0][InM][D]: Monitoring datastore causal-files (119)
Fri Jun  1 18:11:30 2018 [Z0][InM][D]: Monitoring datastore crisp-files (120)
Fri Jun  1 18:11:30 2018 [Z0][ImM][D]: Datastore images (1) successfully monitored.
Fri Jun  1 18:11:30 2018 [Z0][ImM][D]: Datastore files (2) successfully monitored.
Fri Jun  1 18:11:30 2018 [Z0][ImM][D]: Datastore exotic-files (107) successfully monitored.
Fri Jun  1 18:11:30 2018 [Z0][ImM][D]: Datastore exotic-images (108) successfully monitored.
Fri Jun  1 18:11:30 2018 [Z0][MKP][D]: Marketplace OpenNebula Public (0) successfully monitored.
Fri Jun  1 18:11:30 2018 [Z0][ImM][D]: Datastore crisp-images (110) successfully monitored.
Fri Jun  1 18:11:30 2018 [Z0][ImM][D]: Datastore crisp-files (120) successfully monitored.
Fri Jun  1 18:11:30 2018 [Z0][ImM][D]: Datastore causal-files (119) successfully monitored.
Fri Jun  1 18:11:30 2018 [Z0][ImM][D]: Datastore casual-images (116) successfully monitored.
Fri Jun  1 18:11:39 2018 [Z0][InM][D]: Host exotic-walrus.mcresolution.org (3) successfully monitored.
Fri Jun  1 18:11:39 2018 [Z0][InM][D]: Host causal-cat.mcresolution.org (4) successfully monitored.
Fri Jun  1 18:11:41 2018 [Z0][InM][D]: Host crisp-stud.mcresolution.org (5) successfully monitored.
Fri Jun  1 18:12:00 2018 [Z0][InM][D]: Host exotic-walrus.mcresolution.org (3) successfully monitored.
Fri Jun  1 18:12:00 2018 [Z0][InM][D]: Host causal-cat.mcresolution.org (4) successfully monitored.
Fri Jun  1 18:12:01 2018 [Z0][InM][D]: Host crisp-stud.mcresolution.org (5) successfully monitored.
Fri Jun  1 18:12:20 2018 [Z0][InM][D]: Host exotic-walrus.mcresolution.org (3) successfully monitored.
Fri Jun  1 18:12:21 2018 [Z0][InM][D]: Host causal-cat.mcresolution.org (4) successfully monitored.


And finally, running the onedatastore list command as the oneadmin user shows that all my datastores are online and configured to use ssh:

  ID NAME                SIZE AVAIL CLUSTERS     IMAGES TYPE DS      TM      STAT
   0 system                 - -     0                 0 sys  -       ssh     on
   1 images            228.6G 90%   0                 1 img  fs      ssh     on
   2 files             228.6G 90%   0                 0 fil  fs      ssh     on
 105 exotic-system          - -     0                 0 sys  -       ssh     on
 107 exotic-files      228.6G 90%   0                 0 fil  fs      ssh     on
 108 exotic-images     228.6G 90%   0                 0 img  fs      ssh     on
 109 crisp-system           - -     0                 0 sys  -       ssh     on
 110 crisp-images        1.8T 94%   0                 0 img  fs      ssh     on
 115 causal-system          - -     0                 0 sys  -       ssh     on
 116 casual-images       1.8T 94%   0                 1 img  fs      ssh     on
 119 causal-files        1.8T 94%   0                 0 fil  fs      ssh     on
 120 crisp-files         1.8T 94%   0                 0 fil  fs      ssh     on


Does anyone have any ideas on how to fix this issue?

Hi,

If I am not wrong, it is not possible to copy an image from one datastore to another using ssh when the fs DATASTORE_MAD is used.

I’d suggest mounting the “remote” datastore locations using a shared filesystem (NFS, Gluster, etc.). As the datastores are “remote” to the front-end, it looks like your BRIDGE_LIST variable is properly set. It should contain only the hostname or IP of the host where the datastore lives.
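
For reference, this is roughly what the template of the image datastore on that host should contain (a sketch reconstructed from the decoded driver data in your log); you can check it with onedatastore show 116 and adjust it with onedatastore update 116 if needed:

NAME        = "casual-images"
TYPE        = "IMAGE_DS"
DS_MAD      = "fs"
TM_MAD      = "ssh"
BRIDGE_LIST = "causal-cat.mcresolution.org"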

For example, each host exports its local IMAGE datastore via NFS to the other nodes and has the rest of the IMAGE datastores mounted locally.

So, following your logs, the host causal-cat should have:

/var/lib/one/datastores/116 --> exported via NFS to the other hosts
/var/lib/one/datastores/1 <-- mounted via NFS (I’d guess that this datastore is on the front-end?)
/var/lib/one/datastores/108 <-- mounted via NFS from exotic
/var/lib/one/datastores/110 <-- mounted via NFS from crisp
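
A minimal sketch of the corresponding NFS configuration on causal-cat, assuming picked-bass is your front-end (as your shell prompt suggests) and that exotic-walrus and crisp-stud hold datastores 108 and 110; adjust the export options and client list to your environment:

# /etc/exports on causal-cat: export the local image datastore to the other hosts
/var/lib/one/datastores/116  *(rw,sync,no_subtree_check,no_root_squash)

# /etc/fstab on causal-cat: mount the remote image datastores
picked-bass:/var/lib/one/datastores/1                       /var/lib/one/datastores/1    nfs  defaults,soft  0  0
exotic-walrus.mcresolution.org:/var/lib/one/datastores/108  /var/lib/one/datastores/108  nfs  defaults,soft  0  0
crisp-stud.mcresolution.org:/var/lib/one/datastores/110     /var/lib/one/datastores/110  nfs  defaults,soft  0  0

After editing, run exportfs -ra and mount -a as root, and make sure the mounted directories stay owned by oneadmin.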

Regarding the FILES datastores, they are mostly used for content included in the contextualization ISO, which is prepared on the front-end host, so I don’t think you need them on each host. If they are needed, though, you should arrange rsync or something similar to copy the data.
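
For example, a one-shot copy from the front-end could be as simple as the following (a sketch, assuming the FILES datastore with ID 2 from your onedatastore list output; it could also be run periodically from cron):

rsync -a /var/lib/one/datastores/2/ oneadmin@causal-cat.mcresolution.org:/var/lib/one/datastores/2/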

Hope this helps,

Best Regards,
Anton Todorov