HTTP Marketplace on OpenNebula 5.0.2

I have installed and configured an HTTP Marketplace for my OpenNebula 5.0.2 setup.
I would like to have two HTTP servers.
I would like to create Apps and have the image files saved on both HTTP servers. Is that possible?
Is adding the servers' IPs to the BRIDGE_LIST attribute the right way to proceed?

Thanks in advance.

If I understand the question correctly, I think you need to define two Marketplaces:

NAME = market1
BASE_URL = "http://<url_market_1>/"
PUBLIC_DIR = "/path/to/dir/market_1" # <- this is where the files will be written
BRIDGE_LIST = "server_market_1" # <- this is the server the files will be written to

And a similar one for market2.
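For completeness, the second one would look like this (the URL, path, and host name are placeholders you would adapt to your setup):

NAME = market2
BASE_URL = "http://<url_market_2>/"
PUBLIC_DIR = "/path/to/dir/market_2"
BRIDGE_LIST = "server_market_2"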

If BRIDGE_LIST is empty, the files will be stored in PUBLIC_DIR on the frontend, and they must be accessible via HTTP at the address given in BASE_URL. If BRIDGE_LIST is defined, the files will be stored on that server instead of on the frontend.

Sorry, I didn't explain myself well.
I need to create a single Marketplace and be able to save the App files on two different servers.

My configuration would be:
NAME = market1
BASE_URL = "http://<url_market_1>/"
PUBLIC_DIR = "/path/to/dir/market_1"
BRIDGE_LIST = "server1_market_1, server2_market_1"

Both servers have the same PUBLIC_DIR and are accessible via HTTP.

Is it possible?

OK, now I get it. Unfortunately, that is not supported: the script picks one of the hosts in BRIDGE_LIST at random and copies the file only to that server. If you want to change the behaviour, it can be done with a little bit of hacking.

I can think of two options:

  1. Change the script so that, instead of picking one server at random from BRIDGE_LIST, it iterates through all of them and performs the exec_and_log copy for each of the servers.

  2. List only one server in BRIDGE_LIST and distribute the files asynchronously with an rsync triggered via inotify. This would be done outside of OpenNebula. I believe you want to do this because you have a load balancer in front of the nodes, so you must take into account that OpenNebula will finish the register operation before the files have been distributed to all the hosts; a GET may therefore fail until the distribution process finishes. I think I would go for option 1.

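To make option 1 concrete, the change essentially amounts to looping over the hosts in BRIDGE_LIST rather than picking one. A minimal sketch of that loop, kept as a standalone function; the host names, paths, and the distribute helper are illustrative only, not the actual driver code (the real driver wraps the copy in its exec_and_log helper):

```shell
# Sketch of option 1: copy the file to every host in BRIDGE_LIST
# instead of to one random host. The copy command is parametrized
# (scp by default) purely so the sketch can be exercised locally.
distribute() {
    file=$1
    bridge_list=$2
    public_dir=$3
    copy_cmd=${4:-scp}

    for host in $bridge_list; do
        # In the real driver this call would be wrapped in exec_and_log
        "$copy_cmd" "$file" "$host:$public_dir/" || return 1
    done
}

# Example invocation (names match the template above):
# distribute /var/tmp/app.img "server1_market_1 server2_market_1" /path/to/dir/market_1
```

The early `return 1` keeps the driver's fail-fast behaviour: if any host cannot be reached, the register operation fails rather than leaving the marketplace half-populated.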

Thanks very much for the explanation, this is very helpful.