I'm trying to set up MySQL Cluster with 2 DB nodes, where each DB node is on a separate computer:
Management server is on computer 1.
DB node 2 is on computer 2.
DB node 3 is on computer 3.
Computer 1 and 3 are the same machine.
I've edited the config.ini on computer 1 and changed the definition of COMPUTER
2 to the hostname of computer 2. I've edited the Ndb.cfg on computer 2 to
reference the management server on computer 1.
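For reference, here's roughly what the relevant pieces of the two files look like (host1/host2 are placeholders for the real hostnames, and I'm following the section names from the 4.1-era guide, so the exact syntax may differ in other versions):

```ini
# config.ini on computer 1 (sketch; host1/host2 are placeholder hostnames)
[COMPUTER]
Id=1
HostName=host1

[COMPUTER]
Id=2
HostName=host2

[MGM]
Id=1
ExecuteOnComputer=1

[DB]
Id=2
ExecuteOnComputer=2

[DB]
Id=3
ExecuteOnComputer=1

# Ndb.cfg on computer 2 -- connect string pointing at the management
# server on computer 1 (sketch; port 2200 is an example)
# nodeid=2;host=host1:2200
```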
Step 2.10 on page 20 of the MySQL Cluster Administrator Guide seems to say
that's all that's needed.
What I've seen:
Management server and DB node 3 start up and communicate, and apparently DB
node 2 sees the management server (looking at output from ndbd), but the
management server doesn't see DB node 2 (looking at the output of 2 STATUS at
the NDB> prompt).
I am able to get MySQL Cluster running on both computer 1 and computer 2
independently. In that case I am using ndb/ndbcluster.sh --small &
Some other pertinent info:
As a diagnostic, I tried to telnet from computer 1 to computer 2 on port 2202.
Connection refused. But a telnet to localhost on port 2202 on computer 2 is
accepted. I have ports 2200, 2201, 2202, 2203, 2204 open for TCP in iptables on
computer 2.
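In case it helps, this is the kind of probe I mean, written as a small script instead of telnet (just a generic TCP connect check I sketched up; the hostname and port in the example are placeholders, nothing MySQL-specific):

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds, False otherwise."""
    try:
        # create_connection resolves the host and attempts a full TCP handshake
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # covers connection refused, timeouts, and unreachable hosts
        return False

# Example: probe the management-server port from each machine
# ("host2" is a placeholder for computer 2's hostname)
# port_open("host2", 2202)
```

Run from computer 1 against computer 2 it reproduces the "connection refused" result; run on computer 2 against localhost it succeeds, which is what makes me suspect either iptables or the daemon binding only to the loopback interface.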
Any help would be appreciated. Thanks.