When I heard that Spotify was coming to the United States I thought, “What’s the big deal?” With that question in mind, I went to the site to sign up to be notified when the service was available. I was lucky enough to get approved for early access only a few days later. After using the free version of the service, I quickly saw how it could be a game changer.
First off, I’ll point out that I’ve been using the Microsoft Zune service since it came out. Full disclosure: I worked for Microsoft at the time Zune launched. Regardless, it’s a quality product.
To keep this post short, I’ll get to the basics.
Zune Pass: http://zune.net
- The user interface is easy to use and beautiful.
- Great integration with Windows Phone (I have a test phone and it works like a charm), Xbox 360, PC, and Zune HD (a cool but limited device)
- Each month I get to keep 10 songs as MP3s; everything else is protected (WMA format), but as long as I keep paying the subscription I can get as much music as I want.
- Free version allows for (free) streaming of music. (Awesome)
- Paid versions allow access via your mobile device. I have an iPhone and it’s pretty nice. It also works on other mobile platforms such as Windows Phone and Android.
- Streaming is super fast.
Winner: To be determined, but right now I’m starting to think I’ll go the way of Spotify unless Zune makes a few changes. If Spotify can get away with the free service, why can’t Zune? Plus I like being able to keep 10 songs via MP3 each month.
Today was a great day at the beach. I think we spent about 4 or 5 hours there enjoying the waves, sunshine, and hanging out with some of our old neighbors who are still very close friends. Today I learned how to skimboard. I wasn’t that great at it, but it was my first day and loads of fun.
You know, as the kids get older, I find it more fun every day doing things outside. I find that they generally want us parents around, and doing cool stuff is always a plus. Take these video stills from some of the footage we shot today.
They loved getting involved and making something cool. They’re the main stars of my planned summertime movie. I’m getting great stuff from them. Sometimes I wish we had the camera rolling all the time, because their best stuff comes in unplanned moments. Come to think of it, so does their worst.
While this is the same blog URL (http://agramont.net) that I’ve been blogging from for a while, I’m taking a new direction in how and where I blog. In the past my blog has been more about professional topics and content. While this was a great way to get out interesting information about the products and/or services I worked on, it didn’t give me a good place to share other types of information. While I did use spaces like Facebook to share very personal content, that content could never really be shared with the awesomeness of the Internet world. Plus, my previous blogging was done on a shared hosting service (http://ASPnix.com) and I had to both deploy and maintain the blogging software (Community Server), which took too much of my time. So now I’ve moved over to WordPress.
So where is all of that previous content? Well, for now, it’s nowhere. But I do have plans to take the most popular content and stick it in this site’s new archive space.
So here are my new rules for content:
- Post all Corporate/Business content on my company’s blog site. It’s really the right place to put this information. Plus, if I move on, I don’t have to maintain that content or service.
- Post all really personal information on Facebook. Facebook for now, although I’m starting to fall out of love with them because of their weak and inflexible security model. They clearly haven’t learned from other big players such as Microsoft and Google. But I haven’t found a better replacement.
- Picture or Video for all posts. Having a blog post with just text is getting pretty old. It’s more fun when there is a picture or video to help tell the “Story” and give some visual stimulation. Thanks to the Neistat Brothers for the inspiration (more on them later).
- No topic is off limits. It’s my blog and I can blast if I want to (within reason). I’ve been holding back for a long time many of my views on politics, products, and other stupidness. I’d like this to be my truly personal space to document that. Is it a risk? Sure, but sometimes you gotta have your voice heard….right?
Listen, Think, and Share. I’ve found that some of the best blogs listen to their readers and have an open and honest dialog. I’d like to try that out, but first I’ll need readers. Like you! (right?)
OK, enough of that…. Let’s have fun now!
Here’s a super short video of my lovely wife Pam “striking out”.
A few years ago the United States Federal Government, specifically the U.S. Office of Management and Budget (OMB), created a PC standard for the entire government to follow. They provided over 300 settings for Windows XP and Windows Vista in order to create a standard for all computers. This is what is now known as the Federal Desktop Core Configuration (FDCC). There are tons of resources on the Internet, mostly on .gov sites, that provide guidance on what these settings are and how to audit them using publicly available tools.
As with any IT department, defining the policy is one major leap. But to some degree, that’s the easy part. Now you must deploy that configuration and ensure it stays enforced, not to mention audited and reported on. Within the U.S. Government, having a mandate from the OMB is pretty powerful, making this problem space even more critical.
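The audit side of that enforcement loop can be sketched in a few lines. Here’s a minimal illustration in Python; the setting names and values are made up for the example, not the real FDCC baseline:

```python
# Minimal sketch of a baseline audit: compare a machine's actual
# settings against a desired policy. The settings below are
# hypothetical placeholders, not actual FDCC values.

BASELINE = {
    "PasswordComplexity": "Enabled",
    "MinimumPasswordLength": "12",
    "FirewallState": "On",
}

def audit(actual):
    """Return (setting, expected, found) for every setting that is
    missing or deviates from the baseline."""
    findings = []
    for setting, expected in BASELINE.items():
        found = actual.get(setting, "<missing>")
        if found != expected:
            findings.append((setting, expected, found))
    return findings

# Example machine: one compliant setting, one drifted, one missing.
machine = {"PasswordComplexity": "Enabled", "FirewallState": "Off"}
for setting, expected, found in audit(machine):
    print(f"NON-COMPLIANT: {setting}: expected {expected}, found {found}")
```

Real-world tools do the same comparison at scale against SCAP content published on the .gov sites, but the core idea is just this diff between desired and actual state.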
The FDCC is a perfect fit for Virtual Desktops from a deployment and management perspective. Virtual Desktops are all about OS and application standardization and consistency. Think of having a pool of available OS instances, just waiting for a user to log in from a remote device, which could be a hardened thin client or a legacy PC. All of those OS instances are based on a “Master Image” that has been fully configured with the FDCC policies. When a user logs in, all of their applications are delivered via “Application Virtualization” (e.g. Microsoft App-V or Citrix XenApp), which is still abstracted from the underlying “Master Image”, thus keeping the desktop within FDCC standards. All of the user’s data and application data is stored on a centralized store (e.g. a SAN), which again keeps the “Master Image” clean of user data and provides additional benefits for the user and IT (e.g. daily backups of all user data).
So what about those users that go on the road? Well, this is where Virtual Desktop is still in play. Using Microsoft MED-V or Citrix XenDesktop, a user can still take their FDCC-approved image and applications on the road with them. The bonus of Virtual Desktop deployments is that the same process and image-based deployments can be done directly on a physical machine as well. You just take that master image, its settings, and even the application virtualization layer and deploy them directly on a laptop. Something like Microsoft System Center Configuration Manager paired with the Microsoft Deployment Toolkit (a solution accelerator) delivers this type of deployment scenario for both virtual and physical targets.
As with any Virtual Desktop deployment, it’s not like server virtualization! Managing the deployment and operations of a Virtual Desktop Infrastructure (VDI) is extremely different and requires a lot of up-front planning. That’s not to say server virtualization doesn’t, but when you consider the number of different users actually logging onto those virtual desktops, there are lots of end-user scenarios you have to think through. Even with the guidance of the OMB for FDCC (see, here comes the acronym soup), you may still define additional policies for given user roles, which could include access to applications via a variety of delivery models (e.g. web applications, application virtualization, etc.).
Today on one of Citrix’s blogs, they announced a new upcoming technology called “Hosted Virtual Machines” (HVM). As if the virtualization soup of technologies wasn’t big enough already! But this does solve an interesting problem. Without much more information on the subject, here is my take.
Short Version: You want to host an application on a managed VM, but Terminal Services won’t work for a number of reasons. With HVM, you use a client OS such as Windows XP to run the application, but the presentation of it (just like Terminal Services) is then sent to the user.
Long Version: It seems that “Virtualization” is getting more and more attached to every new technology, but at the end of the day it’s about access to applications (that includes the OS and other applications). Let’s put aside the delivery of an OS for now and focus just on the application. There are a number of ways to provide a user with access to an application.
- Traditional – This is where you get a CD or copy files from a file share and install the application locally
- Terminal Services – Based on using a single OS instance, such as Windows Server 2008, and allowing multiple users to log on at one time, but they each have their own “space”/desktop. The display of that OS, or sometimes just a given application, is presented to the user. Everything runs on the server but is shown to the user on their computer
- Application Virtualization – There are a few flavors of this. The simplest view is about delivery. The application is “preinstalled” and “captured” on a given OS (do a traditional install, but all files, registry settings, shortcuts, etc. are captured) and then deployed to any number of users. So one “install” is then executed on any number of computers. The application will run on the local computer, BUT it’s not installed there. No files, registry entries, or shortcuts are anywhere to be found on your computer, yet it still works locally. That’s the virtual part. Again, it’s all about deployment.
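The “capture” step in the last bullet can be illustrated with a toy snapshot diff. This is a hypothetical sketch (made-up paths, plain dicts); real tools like App-V capture files, registry settings, and shortcuts in the same spirit:

```python
# Toy illustration of the "capture" step in application virtualization:
# snapshot system state before and after an install, then package only
# the delta. Real capture tools track files, registry keys, and
# shortcuts; here both are modeled as entries in a dict.

def capture(before, after):
    """Return only the entries the install added or changed."""
    return {path: content
            for path, content in after.items()
            if before.get(path) != content}

before = {r"C:\Windows\system.ini": "v1"}
after = {
    r"C:\Windows\system.ini": "v1",                  # unchanged
    r"C:\Program Files\App\app.exe": "binary",       # added by install
    r"HKLM\Software\App\Version": "4.0",             # added by install
}

package = capture(before, after)
# `package` now holds only what the install created; delivering it to
# any machine reproduces the application without a local install.
```

The deployed package is then projected onto each user’s machine at run time, which is why nothing from the application shows up in the local file system or registry.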
The big issue here is the ability to still provide “Terminal Services”-like deployment of applications, but overcome some of the issues that “Terminal Services” (TS) has. What kind of TS issues? Well, TS is still a server OS. It doesn’t have many of the client components (e.g. those in Windows 7) that some applications require. TS is also multi-user based, and there are some applications that don’t work there either.
So why can’t Application Virtualization (e.g. Microsoft App-V or Citrix XenApp) work? First off, there are certain applications developed by a custom software development shop, built for a given customer and a given OS/application mix. There are other applications certified by an Independent Software Vendor (ISV) with specific requirements. And then there are organizations like government, health care, and more that need to ensure that certain applications and data behave in a given way. For all of these scenarios, an IT shop may want to provide an application to their users but refrain from deploying it locally; it won’t work via TS, and Application Virtualization won’t fulfill their requirements either.
So the solution by Citrix XenApp (in the future) opens some very interesting doors. I don’t think it will be part of the mass adoption, but it will break down certain barriers.
This leads me to think of other solutions such as Microsoft MED-V (Part of MDOP) and MokaFive that provide this kind of host based virtualization, although with HVM Citrix also allows this to be hosted on a server. I guess I’ll have to wait for a Citrix demo and trial for me to learn more.
BTW, I wonder how this will impact hosters looking to get into the application delivery model. Since this does require another client OS, Citrix rightfully notes that you’ll need the Microsoft VECD license. Too bad VECD is not on the Microsoft SPLA list. Bummer.
I just read the announcement that Microsoft put out about their new licensing model for Virtual Desktop Infrastructure (VDI):
If you’ve ever had to figure out Microsoft licensing for any type of business use, you’ll know how complex (and frustrating) it can be. There are plenty of good reasons why it’s so complicated. The VDI scenario makes licensing even harder, since you’re not just talking about end devices anymore (e.g. your laptop); you’re also dealing with many virtual components (e.g. virtual applications deployed on a virtual desktop, deployed on a virtual server, accessed from a thin client).
So based on the new announcement, when it comes to doing an all Microsoft VDI solution, what licensing components do you need to keep in mind?
- Microsoft Virtual Desktop Infrastructure Standard Suite (VDIS) – This is the “Platform” license. It covers all of the licenses you need to run a complete Microsoft solution for VDI. It spans Virtual OS (Hyper-V), Management (System Center + MDOP), and Server CAL (Remote Desktop). (Premium Suite includes additional rights for Session Based Remote Desktop….Formerly named Terminal Services)
- Microsoft Windows Virtual Enterprise Centralized Desktop (VECD) – This is the actual client OS license. It’s licensed per device (e.g. the thin client that you connect from) and allows you to run up to 4 OS instances from that device (which can be spread across any number of servers). Even if you’re NOT using Microsoft for your hypervisor and/or management (e.g. you’re using VMware or Citrix), you MUST still purchase this license.
So what does this mean from a cost perspective? Both of the licenses above are priced per device (e.g. thin client or “legacy” PC), per year.
- VDIS Standard: $21.00 (US) per year
- VECD for SA: $23.00 (US) per year – This is if your device runs a Windows client OS that ALREADY has Software Assurance on it.
- VECD: $110.00 (US) per year – This is for a traditional thin client.
So why does Microsoft do a VECD license in the first place? If you read the license of the Windows client OS (like I’m sure we all do), you’ll notice that the license is perpetual: wherever you install it, it must stay. Not only that, you need an OS license for each OS instance you use. With VECD, you don’t have that same headache. The IT department can deploy any number of combinations of Windows XP, Windows Vista, and Windows 7 for specific roles, tasks, training, or whatever. It doesn’t need to track the total number of virtual OS instances for licensing, as the OS license is tracked by the number of end devices using a given image. Now this doesn’t mean that an IT department will deploy thousands of images (what a headache), as there are better ways to use “golden” images and to dynamically deploy new virtual machines to a “pool” of available clients (future post!!). But this does free up the IT department to provide OS instances and applications on demand for customers, because VECD covers them to do so! Again, this is a MUST for ALL VDI DEPLOYMENTS no matter which vendor you use for Virtual Desktop.
I think the license change from Microsoft will make it MUCH easier for customers to budget for Virtual Desktop using the Microsoft platform.
ISV Guidelines for Hosted Microsoft Dynamics CRM 4.0 Part 1
The intent of this series of posts is to provide a basic guideline for Service Providers looking to offer CRM as a target platform for ISVs that want to deploy their applications on the Internet under a Software as a Service (SaaS) model. It’s also a guide for ISVs to understand that the design decisions they make during development will have a profound impact on their hosting options with regard to deployment architectures and pricing.
- Part 1: Introduction
- Part 2: CRM as an Application or Platform
- Part 3: Shared or Virtualization Deployment & Licensing
- Part 4: Provisioning & Control Panels
- Part 5: Making the Leap
With the release of Microsoft Dynamics CRM 4.0 (MSCRM4), Microsoft has provided not only a great CRM application, but also a business application platform. There are many software vendors and consulting organizations that have already leveraged MSCRM4 in the traditional deployment, where the application or solution is installed locally on a customer’s server. This scenario is typically called “On-Premise”. While this deployment model works great for some customers, many business departments are looking to gain access to applications that improve their business, but without the hassle and cost of deployment and operations within their IT department. It’s not that an IT department can’t handle new applications, but it takes time, money, and knowledge to add a new application into the business. Business applications that are hosted on the Internet and accessible via a traditional browser are known as “Software as a Service” (SaaS). The business world is all abuzz about SaaS and its potential to deliver rich applications to departments, on demand, with a monthly fee, and without upfront deployment or hardware costs. Sounds great, right? Well, for some scenarios it is pretty great, but there are a number of reasons why it might not be so hot for others (e.g. security, Internet outages, performance, end-user training, etc.).
Here are some examples of where the SaaS deployment model is so interesting for many customers:
- Trial – Your software may be great and the value high, but how will the customer know if they can’t try it in all its glory? Sure, they could download the software and use it, but now that requires hardware, time, and the knowledge to get it installed and configured. With the SaaS-based model, they can get access to your application instantly! Even if they are interested in an on-premise deployment, they can at least get a feel for it right away, which will help with their buying decision.
- Temporary Usage – Some customers may decide that they do want the on-premise version. This could be for any number of reasons the customer may have, or because your on-premise version has more capabilities (e.g. integration with a VOIP solution, devices, etc.) than the SaaS version. In this scenario, the customer goes beyond the online trial and wants to continue using it. Let’s say it’s going to take six months for the customer’s IT team to purchase, deploy, and operationalize [killing the English language] an environment for the on-premise version. Until then, the customer uses the SaaS version. This gives the customer some flexibility in their deployment and instant access to an application that will improve their business, and it increases your sales and revenue.
- Migrations – I’m sure you’ve seen a number of customers that would LOVE to move to a new version of a software application they’ve been using, but the time and cost to upgrade hardware, migrate the data, and learn the new platform is just too much for them. This is another great scenario for SaaS: it meets the business needs of the customer, removes the strain on their IT department, and increases revenue for you (the SaaS vendor).
- SaaS Everything – There is a growing trend for many organizations to outsource more and more of their applications. Well, that’s what the industry says at least. For those businesses, you at least need to have SaaS as a delivery option for them or you may lose some business.
There are a number of Service Providers out there today that are offering hosted solutions for Microsoft Exchange Server (for consumer and business mail), Windows Server (for web hosting with Internet Information Server which is part of Windows Server), SQL Server (for databases), and SharePoint Services (for document and information collaboration). MSCRM4 is a natural extension for Service Providers to also offer this service. While there is much competition in the space of CRM systems on the Internet, including the current leader Salesforce.com, MSCRM4 is easily configurable, extensible, and leverages the Microsoft .NET Framework which will enable the army of Microsoft developers hooked on their Microsoft Visual Studio development environment to build rich business applications.
When developing software, the sky is the limit! Especially when developing on the Microsoft platform and technologies, but you must be careful that you follow some basic guidelines to ensure your application can be hosted as a SaaS application and meet your target business objectives. There is much to consider and I hope you find the rest of this series helpful.
Note: If there are specific areas you’d like me to cover in future posts, please post a comment below.