Ask any organization what's holding up its full cloud migration and you're bound to hear "legacy Windows applications" as a common answer.
When an organization migrates to the cloud, it still needs to give its people access to the critical legacy Windows applications they rely on to stay productive, even if those apps haven't been web-enabled. Using Virtual Desktop Infrastructure (VDI) or traditional application virtualization products for this purpose is overkill: these solutions are far too costly and complex for the job.
Every IT admin knows that if they can't provide their people with a good user experience, those people will point the finger right back at IT and blame them for it. One area where this finger-pointing is notoriously prevalent is the use of virtual desktops for virtual application delivery.
I recently came across a brilliant blog post from Brian Madden at VMware entitled “How to convince your users that VDI is good for them.” The post makes a particularly salient point at the end: “Only use VDI where it makes sense!”