Configure Nutanix Xi Frame Streaming Gateway Appliance for AHV VDI Remote Connectivity


Part 1: Nutanix Xi Frame On-Prem AHV: What Does it Mean to Citrix, VMware and The EUC Community !

Part 2: Deploy & Configure Xi Frame VDI on On-Premises AHV

Part 3: Enhance Nutanix Xi Frame User Experience Domain Settings Optimization On-premises AHV

Part 4: Configure Nutanix Xi Frame Enterprise Profiles For VDI On AHV

Part 5: Configure Nutanix Xi Frame Streaming Gateway Appliance for AHV VDI

Introduction

To provide virtual app and desktop access to remote users who are not within the routed network of the deployed AHV resources, a Nutanix Xi Frame Streaming Gateway Appliance (SGA) must be deployed and configured. The appliance acts as an access gateway, brokering remote connection requests to the internal network for workloads hosted on on-premises AHV.

Requirements

As of now, only one appliance is supported per account, though I will be testing load balancing very soon. In production environments, this appliance should be deployed in the DMZ. The following are the requirements and prerequisites for deploying the Xi Frame SGA:

  • Download the Nutanix Xi Frame SGA appliance using this Nutanix support provided LINK.
  • A wildcard DNS entry “*.domain.com” configured with your public DNS provider, for a public domain of your choosing, pointing to a public IP in your environment.
  • The public IP targeted by the wildcard DNS entry must NAT TCP 443 to the internal IP of the Xi Frame SGA appliance.
  • The same public domain used for the wildcard DNS entry also requires a publicly trusted wildcard SSL certificate.
  • The SGA appliance requires internet access to the Frame service on TCP 443, and possibly other ports not yet listed in the Frame documentation.
  • The virtual workloads require internet access to the Frame service on port 8112 to the following public IPs: 35.173.64.151, 18.232.236.200, 18.214.119.1. The virtual workloads also require network access to the SGA appliance on TCP 443 and 8112 (other ports might also be required).
  • One SGA per account is currently supported, sized at 2 vCPU and 4 GB RAM. No disk is required; the appliance boots from the attached ISO and is configured by the YAML script.
  • Configuration is pushed to the SGA using a YAML script that holds all the configuration variables, such as the certificates and the CIDR network used by the virtual workloads. Download the YAML file here and change the certificates, public domain, and CIDR range.
  • After configuration, a support case is required so that Xi Frame personnel can activate the remote connectivity service on their side as well, since they host the control plane. It took two interventions to make it work, so reference support case #00104769.
  • The Nutanix Xi Frame SGA appliance must be deployed with its clock set to UTC.

Configuration

Let's start by downloading and importing the SGA appliance on the AHV cluster, then finalizing the initial appliance configuration such as the YAML script and networking settings. To prepare the YAML script, we need to have in hand the wildcard certificate and the public domain whose wildcard DNS record points to the public IP that is NATed to the internal IP of the SGA.

To extract the wildcard certificate's private key and export the certificate from a .pfx file, use the procedure below. If you have a PFX that you would like to convert to a PEM certificate and PEM key (without a password), just download OpenSSL-Win32:

  1. Open CMD and run the following command:
    set OPENSSL_CONF=c:\OpenSSL-Win32\bin\cnf\openssl.cnf

  2. Copy the wildcard cert into the folder C:\OpenSSL-Win32\bin

  3. CMD to C:\OpenSSL-Win32\bin

  4. openssl pkcs12 -in wildcard.pfx -nocerts -out key.pem -nodes

  5. openssl pkcs12 -in wildcard.pfx -nokeys -out cert.pem

  6. openssl rsa -in key.pem -out server.key

cert.pem is your certificate and server.key is your private key (without a passphrase). Since this is most likely a publicly trusted wildcard certificate, you also need to export the intermediate and root certificates. Open the certificate, navigate to the Certification Path tab, view the intermediate and root certificates respectively, and for each, copy to file as Base-64 encoded X.509.
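A quick sanity check before building the YAML file: the exported certificate and private key must be a matching pair. A minimal sketch, assuming the cert.pem and server.key files produced by the steps above, is to compare their public-key digests; identical output from both commands means they match:

```shell
# The two digests must be identical for a matching certificate/key pair.
openssl x509 -in cert.pem -noout -pubkey | openssl sha256
openssl rsa -in server.key -pubout | openssl sha256
```

If the digests differ, the key was extracted from a different PFX than the certificate.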


Now that we have all the required variables for the YAML file, paste in the certificates ordered from top to bottom as root, intermediate, then the actual wildcard certificate, and include the public domain of the wildcard DNS entry and the network range hosting the virtual workloads that require remote connectivity. Download the YAML structure file here to save some time and change the required variables: certificates, domain, and CIDR.
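For orientation, the shape of the file is roughly as follows. This is an illustrative sketch only: the real key names come from the Nutanix-provided YAML template, and the field names below (certificate, private_key, domain, workload_cidr) are placeholders, not the actual schema:

```yaml
# Sketch only -- keep the key names from the Nutanix-provided template.
# Certificates are pasted top to bottom: root, intermediate(s), then the
# wildcard (server) certificate.
certificate: |
  -----BEGIN CERTIFICATE-----
  ... root certificate ...
  -----END CERTIFICATE-----
  -----BEGIN CERTIFICATE-----
  ... intermediate certificate ...
  -----END CERTIFICATE-----
  -----BEGIN CERTIFICATE-----
  ... wildcard certificate ...
  -----END CERTIFICATE-----
private_key: |
  -----BEGIN RSA PRIVATE KEY-----
  ... contents of server.key ...
  -----END RSA PRIVATE KEY-----
domain: "*.domain.com"          # public domain behind the wildcard DNS entry
workload_cidr: "10.0.0.0/24"    # network range hosting the virtual workloads
```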


Import the Frame SGA ISO into the AHV cluster image configuration and deploy a new virtual machine with 2 vCPU and 4 GB RAM. Attach the SGA ISO to the virtual drive, add a network (DMZ), and apply the YAML script by pasting it into the custom script tab. If the VM fails to create, the YAML script is not structured or pasted correctly, so make sure the structure matches the YAML script template. Finally, power on the SGA appliance and change the network settings if DHCP is not available (if using DHCP, remember to reserve the address, since a NAT rule will point to this IP).


Type “netconfig” to change the network settings of the appliance; “logs” or “livelogs” can be used to monitor the status of the services on the appliance. Make sure you deactivate then reactivate the NIC after any configuration change so that the appliance picks it up.


Now is the time to create the public wildcard DNS entry with your public domain provider pointing to a public IP, create a NAT rule forwarding TCP 443 to the SGA, and make sure all ports listed in the requirements are open.
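Before publishing the DNS record, it is worth confirming that the wildcard certificate actually covers the chosen domain. A minimal check, assuming the cert.pem file from the export steps earlier, is to inspect its Subject Alternative Name extension:

```shell
# The output should list DNS:*.domain.com for your public domain.
openssl x509 -in cert.pem -noout -text | grep -A1 "Subject Alternative Name"
```

If your public domain does not appear here, browsers will reject the gateway's certificate even with DNS and NAT configured correctly.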


Nothing more is required configuration-wise except opening a support case with the Xi Frame team to enable remote connectivity for your account. The end-user experience is exactly the same internally and externally in terms of accessing the Xi Frame portal, so the same URL applies. One thing I could not determine is whether, once the SGA is enabled, all connections are routed through it, even internal ones. I am fairly sure that is the case, so it needs to be addressed by Nutanix Xi Frame personnel, perhaps by allowing internal IP ranges to be overridden.

Conclusion

Remote connectivity is a crucial part of any VDI offering, and not having to rely on third-party access gateways for it is a step in the right direction for Nutanix Xi Frame. I wonder how Frame will position itself in the digital workspace arena, and whether any vendor, be it Citrix and/or VMware, would allow Mobility/Airwatch/Identity Manager/Workspace to integrate with Frame VDI in the long run, or whether we are going to see a Nutanix acquisition for UEM and entry into that market segment? No one ever knows what they are thinking. Allow me to quote myself: “If VMware has it, expect Nutanix to be thinking about it”, it's as simple as that.


May the Peace, Mercy, and Blessing of God Be Upon You
