PlateSpin Orchestrate: Configure VMotion (Live Migration) of a Virtual Machine in VCenter 2.x



By: battala

May 15, 2009 11:08 am


Virtualization plays a vital role in the data center. Most enterprises are moving toward virtualization for server consolidation.

When I first tried to configure VMotion (live migration of a VM) for PlateSpin Orchestrate, I couldn’t find a good resource on how to do it. This article documents the procedure so that anyone configuring live migration in the future can complete the task more easily.

Live Migration of a Virtual Machine:

  • Live Migration is the process of moving a virtual machine from one physical server to another without shutting it down and without any impact on the services it provides.
  • Xen, VMware, and the Hyper-V role in Windows Server 2008 R2 all support live migration of a VM.
  • Each vendor names the feature differently: VMware calls it VMotion, while Xen calls it Live Migration.

VMotion is the technology used by VMWare to move the running VM from one physical server to another physical server without shutting down or suspending the VM.

The steps to configure a VM to be eligible for Live Migration are:

  1. Install VCenter 2.x.
  2. Inventory at least 2 ESX servers (source and target).
  3. Both should be on the same network.
  4. Create a shared storage (iSCSI, NAS, or FC-SAN) between the source and the target machine.
  5. Place the virtual machine to be migrated on the shared storage.
  6. Provision the virtual machine.
  7. Right-click on the VM and select “Move”.
  8. Select the destination host as the other ESX server (target server) where the shared storage is attached.
  9. You can watch the VM migrate to the other host in seconds.
  10. The VM will then be running on the target server.

1. Installing the VCenter 2.x:

Download VCenter 2.x at: http://www.vmware.com/download/download.do?downloadGroup=VC250U4. If a license is not available, it will work as a trial version for 60 days only.

2. Install the ESX Servers (3.x version).

3. Create the shared storage between the two ESX Servers (if FC-SAN is not available, it is better to create an iSCSI storage instead of NAS).

Create an iSCSI LUN from any Linux machine (SUSE Linux has good tools for both the iSCSI target and initiator). Export a partition on the Linux machine as an iSCSI target, then test with any initiator that the target is working properly.
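As a minimal sketch of exporting a partition with the iscsitarget package on SUSE Linux (the IQN, device path, and service names below are assumptions, not from the article; adapt them to your system or use the YaST method linked below):

```shell
# Hypothetical /etc/ietd.conf entry exporting /dev/sdb1 as LUN 0
# (IQN and device path are examples only):
#
#   Target iqn.2009-05.com.example:storage.vmotion
#       Lun 0 Path=/dev/sdb1,Type=fileio

# Start the iSCSI target service and enable it at boot (SUSE)
rciscsitarget start
chkconfig iscsitarget on
```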

Here is a good Cool Solutions article on how to create the iSCSI target: (Setting Up iSCSI Using YaST)

Now the task is to access the iSCSI storage from both the source and target ESX servers.

Here is how to add an iSCSI LUN as storage to an ESX server when managing it from VCenter.

NOTE: The ESX server ships with a built-in software iSCSI client.

First, enable the iSCSI client service. We can enable it either from the command line or from the VI Client:

  1. Command:
    esxcfg-firewall -e swISCSIClient

    This command will allow the iSCSI client service through the firewall.
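To confirm the change took effect, the service console firewall can be queried (a sketch; the exact output format varies by ESX build):

```shell
# Query the current ESX firewall configuration and confirm that
# swISCSIClient now appears among the enabled services
esxcfg-firewall -q
```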

  2. How to enable via the VI Client:
    1. Select the Host and click on the Configuration tab.
    2. Select Security Profile.
    3. Click on Properties (top right).

In the Security Profile properties wizard, select the iSCSI Client service.

After enabling the service, we need to attach the shared storage to the ESX server.

To attach the iSCSI disk, we need to add a connection of type VMkernel to the ESX server.

NOTE: Only after adding a VMkernel port can you configure the iSCSI storage.

How to add the VMkernel port to the ESX server:

Select the Host, and go to the Configuration tab.

Select Networking.

In the top right of the screen, click the “Add Networking” option; the “Add Network Wizard” opens to add the VMkernel port.

Provide the necessary details (IP address, subnet mask, and default gateway).

Once all the details are provided, the network diagram shows which NIC card the traffic will use.

Click on the newly created VMkernel network connection and edit it to open the VMkernel properties wizard.
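The same VMkernel setup can be sketched from the ESX service console. The switch name, uplink NIC, port group name, and addresses below are assumptions; substitute your own values. (Enabling VMotion on the port is still done in the VI Client as described next.)

```shell
esxcfg-vswitch -a vSwitch1            # create a virtual switch
esxcfg-vswitch -L vmnic1 vSwitch1     # attach a physical uplink NIC
esxcfg-vswitch -A VMkernel vSwitch1   # add a port group for the VMkernel
# create the VMkernel NIC with an assumed IP and netmask
esxcfg-vmknic -a -i 192.168.10.11 -n 255.255.255.0 VMkernel
esxcfg-route 192.168.10.1             # set the VMkernel default gateway
```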

In the General tab:

NOTE: Enable the VMotion option; live migration will not work without it.

Make sure the connection settings are correct.

Now click on the Configuration tab, then Storage Adapters.

Select the iSCSI Software Adapter.

Click on Properties.

Click on Configure and enable the initiator in the properties dialog.

Go to the Dynamic Discovery tab and add the IP of the iSCSI Target that we have created.

The default port is 3260 for the iSCSI transport.

After selecting “OK”, a job is fired to discover the iSCSI targets at the IP address we provided. It will discover all of the iSCSI LUNs and show the properties of each discovered LUN.
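The same discovery can be sketched from the service console. The adapter name vmhba40 and the portal address 192.168.10.20 are assumptions; check the real adapter name under Storage Adapters.

```shell
esxcfg-swiscsi -e                          # enable the software iSCSI initiator
vmkiscsi-tool -D -a 192.168.10.20 vmhba40  # add a send-targets (dynamic) discovery address
esxcfg-swiscsi -s                          # rescan so the new LUNs show up
```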

Now the LUNs are raw disks. We need to format them with the VMFS file system.

Click on the Configuration tab, then Storage.

Click on the option “Add Storage”.

A wizard will guide you to add the Discovered LUNs as storage to the ESX Server.
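Formatting can also be sketched from the service console with vmkfstools. The vmhba40:0:0:1 device path and the "iscsi_shared" label are assumptions; take the real vmhbaC:T:L:P path from the Add Storage wizard.

```shell
# Format partition 1 of the discovered LUN as VMFS3 with a datastore label
vmkfstools -C vmfs3 -S iscsi_shared vmhba40:0:0:1
```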

Repeat the above process on the target ESX server (enabling VMotion, enabling the iSCSI initiator, and adding the same storage LUN).

Now create a virtual machine on the shared storage that is connected to both the source and the target server.

Click on the VM and select the Maps tab to view the location of the VM.

The map shows the location of the VM, which network it is connected to, and which host it is currently running on. Both hosts are on the same network.

Now that the configuration is complete, the VM is ready to live migrate.

Now provision the VM and make sure it is in a running state.

Right-click on the VM, then click Move.

A Move Operation wizard will prompt you for the host the VM should be moved to.

Select the target server and click OK.

If the VM is not qualified for live migration, warnings or errors will explain why it can’t be moved to the target server.

Please refer to the links below for VMware’s suggestions and the support matrix of CPU configurations.

For the CPU support matrix: http://kb.vmware.com/selfservice/microsites/search.do?language=en_US&cmd=displayKC&externalId=1991

VMware suggestions and requirements for VMotion:
http://www.vmware.com/support/vc13/doc/c2vmotionreqs12.html

Categories: PlateSpin, PlateSpin Orchestrate, Technical Solutions

Disclaimer: As with everything else at NetIQ Cool Solutions, this content is definitely not supported by NetIQ, so Customer Support will not be able to help you if it has any adverse effect on your environment.  It just worked for at least one person, and perhaps it will be useful for you too.  Be sure to test in a non-production environment.
