VMware ESXi vMotion using directly connected card without switch

Hi guys,

I was trying to test a server setup with 2x 1Gb and 2x 10Gb ports so I can move VMs from one host to the other using vMotion (no shared storage). The problem is, I don't have a 10Gb switch, so I've been managing the servers through the 1Gb ports connected to a switch, while the 10Gb ports are connected directly from one server to the other, with no switch in between.

On the 10Gb cards, I set up an IP address and gateway (which can't really go anywhere, as the ports are connected directly to each other), and I manage the servers using vCenter, which reaches them through the switch on the 1Gb ports. I enabled vMotion on the 10Gb ports and tried to vMotion. It got stuck.

If I do the same on the 1Gb ports, enabling vMotion on them, it works fine. I don't know if it is because vMotion can't reach a gateway, but since everything is in the same subnet it shouldn't need a gateway.
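(For reference, a back-to-back link is normally addressed as its own small subnet with no gateway configured at all. The interface name and addresses below are just examples, not my actual config:)

```shell
# On host A: static address for the vMotion VMkernel port (vmk1 is an example)
esxcli network ip interface ipv4 set -i vmk1 -I 192.168.100.1 -N 255.255.255.0 -t static

# On host B: the peer address in the same /24
esxcli network ip interface ipv4 set -i vmk1 -I 192.168.100.2 -N 255.255.255.0 -t static

# No gateway is needed: same-subnet traffic is delivered directly over the link.
```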

I heard that migration without shared storage is actually a two-step process:

1. The cold migration copies the VMDK files and uses the normal management network, not the vMotion network.

2. vMotion proper copies only the memory state over the network to the machine on the new host; no VMDK files are involved in this step.

Is this right?

Even so, I should be able to use the directly connected networks to migrate VMDKs or vMotion the VMs to a different host. Why am I encountering this problem only when connecting the NICs directly to each other?

I was planning to get some InfiniBand cards, as they are a cheap way to get 10-40Gb without a switch, but if this is a limitation and I can't properly interconnect the hosts and vMotion between them, it may not make much sense.

Thanks!
Asked by Alex

Andrew Hancock (VMware vExpert / EE MVE^2), VMware and Virtualization Consultant, commented:
Your assumptions are correct.

You can connect network ports directly between servers; as long as the link comes up, there is no need for a switch.

Test your vMotion network (VMkernel portgroups) using ping; good communication between the vMotion portgroups is required for a vMotion to complete.
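A quick way to test from the ESXi shell is `vmkping`, which forces the ping out of a specific VMkernel interface (vmk1 and the peer address below are examples; substitute your own vMotion interface and the other host's vMotion IP):

```shell
# Ping the peer's vMotion address via the vMotion VMkernel interface
vmkping -I vmk1 192.168.100.2

# If you use jumbo frames, also test a large packet with fragmentation disabled
# (8972 = 9000 MTU minus 28 bytes of IP/ICMP headers)
vmkping -I vmk1 -d -s 8972 192.168.100.2
```

If the plain ping fails, the vMotion network itself is broken (cabling, VLAN, subnet mismatch) and vMotion will hang exactly as you describe.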

Have you tested the vMotion network?
Alex (Author) commented:
That is why I could not understand why it didn't work.

The 10Gb NICs have vMotion enabled, an IP address, and a physical link.

I'm going to test again.

Also, I was wondering: can I make the normal traffic between the hosts go through the 10Gb NICs, while still managing them over the 1Gb NICs?

I would like to test moving VMs from one host to the other over the 10Gb cards with no switch, for example. I believe it can be done, as people have been using InfiniBand cards without switches.
Andrew Hancock (VMware vExpert / EE MVE^2), VMware and Virtualization Consultant, commented:
You can select which interface you want to use for Management, and you can select which network interface you want to use for vMotion.

Remember, the vMotion network can use its own interfaces and IP addresses, and it is usually recommended to keep it on a separate network from management.
Alex (Author) commented:
Can I also have multiple NICs for management, in a way that forces the traffic of, for example, moving a VM to a different host to go through the 10Gb cards? Remember that I can't reach these cards remotely for management; they are connected directly between the hosts without a switch.

What I understand is that the management network is the network that also carries the traffic when VMs are being moved across hosts, so I'm wondering how I can achieve this.
Andrew Hancock (VMware vExpert / EE MVE^2), VMware and Virtualization Consultant, commented:
vMotion traffic goes through the vMotion-enabled VMkernel port.

The Management port is what vCenter Server connects to.

Set up a new vMotion network on your 10GbE cards, on a different IP subnet, and flag ONLY these VMkernel portgroups for vMotion.
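For example, from the ESXi shell on each host (the interface name, portgroup name, and address are examples; adjust to your environment):

```shell
# Create a new VMkernel NIC attached to the 10GbE portgroup
esxcli network ip interface add -i vmk2 -p vMotion-10G

# Give it a static address on a subnet dedicated to the back-to-back link
esxcli network ip interface ipv4 set -i vmk2 -I 192.168.200.1 -N 255.255.255.0 -t static

# Tag this interface for vMotion only (leave the Management tag on vmk0)
esxcli network ip interface tag add -i vmk2 -t VMotion
```

The second host gets the same setup with its own address (e.g. 192.168.200.2). Because only vmk2 carries the vMotion tag, the memory copy is forced over the 10GbE link while management stays on the 1Gb ports.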
Alex (Author) commented:
This doesn't accomplish what I'm trying to do. I want to move the VMDKs over the 10Gb cards too, not just the memory state through vMotion.

Can the VMDK copy happen over a network that vCenter is not connected to?
Andrew Hancock (VMware vExpert / EE MVE^2), VMware and Virtualization Consultant, commented:
Okay, so we are talking about both cold migration and hot migration (vMotion).

No. vCenter Server manages the migration, so vCenter Server must have network access to the hosts involved in order to perform it.
