Author Archive

Resolving Amazon Echo registration error 10:2:5:60:1

January 29, 2017

When trying to register a new “Amazon Echo” into my home network, I received the following error:

There was an error registering your device.  Visit Help for troubleshooting tips.  Error 10:2:5:60:1

I followed all of the usual steps (reset device, reset router, uninstall/reinstall mobile app, etc.), but nothing worked. I also set up a second Wi-Fi network to see if that was the issue.

I should point out that I already had another Amazon Echo Dot that was previously set up and still working on the same Wi-Fi network, and it never had any issues registering. I also took that device offline to make sure it wasn't the problem.

I then removed all other Wi-Fi devices from the network. It still didn't work.

I called Amazon support. They were great, but it still didn't work. So they sent me a new one, and I received it in two days. They're awesome.

What wasn't awesome was that the new device had the SAME ERROR! So now I knew it WAS the network.

With all other devices off the network, I checked the logs (Advanced > Logs) on my cable modem (Netgear C6300, firmware V2.01.14) and saw that it was reporting a DoS attack coming from the Echo device. Below is the error I saw:

[DoS attack] Port Scan PROTO:UDP SPT:50395 DPT:123

The firewall didn't have any restrictions on which devices could join the network, nor was it blocking outbound traffic, so this was weird. I found a setting to ignore this type of traffic rather than block it. I crossed my fingers and gave it a try.
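For what it's worth, destination port 123 (the DPT:123 in that log line) is NTP, the Network Time Protocol, so my guess is the router was mis-classifying the Echo's clock-sync traffic as a port scan. As a rough illustration only (Amazon doesn't document the Echo's time servers, so pool.ntp.org below is just a stand-in), here's a minimal Python sketch of the same kind of UDP/123 request:

    import socket
    import struct
    import time

    # A minimal SNTP-style query: the same UDP destination-port-123 traffic
    # pattern the Netgear log flagged as a "Port Scan".
    NTP_SERVER = "pool.ntp.org"      # stand-in server; the Echo uses its own
    NTP_TO_UNIX = 2208988800         # seconds between 1900 (NTP) and 1970 (Unix)

    packet = b"\x1b" + 47 * b"\x00"  # LI=0, Version=3, Mode=3 (client request)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(5)
        sock.sendto(packet, (NTP_SERVER, 123))  # source port is ephemeral, like SPT:50395
        data, _ = sock.recvfrom(48)

    seconds = struct.unpack("!I", data[40:44])[0] - NTP_TO_UNIX
    print("NTP server time:", time.ctime(seconds))

Nothing about this traffic is hostile; the router just seemed to treat the Echo's burst of time-sync requests as a scan.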

To apply the fix, first log in to your router (typically http://192.168.0.1/), then click the "Advanced" tab. Click "Setup" in the left menu, then click "WAN Setup". Check the box for "Disable Port Scan and DoS Protection", then click "Apply".

Once your router has restarted, go through the normal Amazon Echo setup process. Now your Echo should register. After it's working, go back into your Netgear settings and turn the protection back on by unchecking the box we enabled earlier. Afterward, the Echo should still work AND you're still protected against port scans and DoS attacks.

I hope this helps you in your situation.

Categories: Amazon Alexa, IoT, Technology

To Zune or to Spotify… That’s the question

July 26, 2011

When I heard that Spotify was coming to the United States I thought, "What's the big deal?" With that question in hand, I went to the site to get notified when the service was available. I was lucky enough to get approved for early access only a few days later. After using the free version of the service, I quickly saw how it could be a game changer.

First off, I'll point out that I've been using the Microsoft Zune service since it came out. Full disclosure: I worked for Microsoft at the time Zune launched. Regardless, it's a quality product.

To keep this post short, I’ll get to the basics.

Zune Pass: http://zune.net

  1. User Interface is easy and beautiful.
  2. Great integration with Windows Phone (I have a test phone and it works like a charm), XBOX 360, PC, and Zune HD (cool but limited device)
  3. Each month I get to keep 10 songs as MP3s; everything else is protected (WMA format), but as long as I keep paying the subscription I can get as much as I want.

Spotify: http://www.spotify.com

  1. Free version allows for (free) streaming of music.  (Awesome)
  2. Paid versions allow access via your mobile device. I have an iPhone and it's pretty nice. It also works on other mobile platforms such as Windows Phone and Android.
  3. Streaming is super fast.

Winner: To be determined, but right now I'm starting to think I'll go the way of Spotify unless Zune makes a few changes. If Spotify can get away with the free service, why can't Zune? Plus I like being able to keep 10 songs as MP3s each month.

Categories: Mobile, Technology

Great day at the beach

August 1, 2010

Today was a great day at the beach. I think we spent about 4 or 5 hours there enjoying the waves, sunshine, and hanging out with some of our old neighbors yet still very close friends. Today I learned how to Skimboard. I wasn’t that great at it, but it was my first day and loads of fun.

You know, as the kids get older, I find it more fun every day doing things outside. I find that they generally want us parents around, and doing cool stuff is always a plus. Take these video stills from some of the footage we shot today.

They loved getting involved and making something cool. They're the main stars of my planned summertime movie. I'm getting great stuff from them. Sometimes I wish we had the camera rolling all the time, because their best stuff comes in unplanned moments. Come to think of it, so does their worst.

Categories: Beach, Family

Conrad Blogging 3.0

July 30, 2010

While this is the same blog URL (http://agramont.net) that I've been blogging from for a while, I'm taking a new direction in how and where I blog. In the past my blog has been more about professional topics and content. While that was a great way to get out interesting information about the products and/or services I worked on, it didn't give me a good place to share other types of information. While I did use spaces like Facebook to share very personal content, that content could never really be shared with the awesomeness of the Internet world. Plus, my previous blogging was done on a shared hosting service (http://ASPnix.com), and I had to both deploy and maintain the blogging software (Community Server), which took up too much of my time. So now I've moved over to WordPress.

So where is all of that previous content? Well for now, it's nowhere. But I do have plans to take the most popular content and stick it in this site's new archive space.

So here are my new rules for content:

  1. Post all corporate/business content on my company's blog site. It's really the right place to put this information. Plus, if I move on, I don't have to maintain that content or service.
  2. Post all really personal information on Facebook.  Facebook for now, although I’m starting to fall out of love with them because of their weak and inflexible security model.  They clearly haven’t learned from other big players such as Microsoft and Google.  But I haven’t found a better replacement.
  3. Picture or Video for all posts.  Having a blog post with just text is getting pretty old.  It’s more fun when there is a picture or video to help tell the “Story” and give some visual stimulation.  Thanks to the Neistat Brothers for the inspiration (more on them later).
  4. No topic is off limits.  It’s my blog and I can blast if I want to (within reason).  I’ve been holding back for a long time many of my views on politics, products, and other stupidness.  I’d like this to be my truly personal space to document that.  Is it a risk?  Sure, but sometimes you gotta have your voice heard….right?
  5. Listen, Think, and Share. I've found that some of the best blogs listen to their readers and have an open and honest dialog. I'd like to try that out, but first I'll need readers. Like you! (right?)

OK, enough of that…. Let’s have fun now!

Here’s a super short video of my lovely wife Pam “striking out”

Categories: News

Virtual Desktops and Federal Desktop Core Configuration (FDCC)

July 31, 2009

A few years ago the United States Federal Government (specifically the U.S. Office of Management and Budget) created a PC standard for the entire government to follow. They provided over 300 settings for Windows XP and Windows Vista in order to create a standard for all computers. This is what is now known as the Federal Desktop Core Configuration (FDCC). There are a ton of resources on the Internet, mostly on .gov sites, that provide guidance on what these settings are and how to audit them using publicly available tools.
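To give a flavor of what auditing those settings looks like in practice, here's a small, hypothetical Python sketch that spot-checks a couple of registry-backed settings on a Windows machine. The two values I check are purely illustrative examples of my own choosing, not the official FDCC baseline (which is published as checklist/SCAP content on the .gov sites):

    import winreg

    # Hypothetical spot-check of two registry-backed desktop settings.
    # Illustrative only; the real FDCC baseline defines 300+ settings.
    CHECKS = [
        # (hive, subkey, value name, expected value)
        (winreg.HKEY_LOCAL_MACHINE,
         r"SYSTEM\CurrentControlSet\Services\SharedAccess\Parameters"
         r"\FirewallPolicy\StandardProfile",
         "EnableFirewall", 1),
        (winreg.HKEY_CURRENT_USER,
         r"Control Panel\Desktop",
         "ScreenSaverIsSecure", "1"),
    ]

    for hive, subkey, name, expected in CHECKS:
        try:
            with winreg.OpenKey(hive, subkey) as key:
                value, _ = winreg.QueryValueEx(key, name)
            status = "OK" if value == expected else f"MISMATCH (found {value!r})"
        except OSError:
            status = "NOT SET"
        print(f"{name}: {status}")

Multiply that by a few hundred settings and thousands of PCs and you can see why enforcement and reporting become the real problem.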

As with any IT department, defining the policy is one major leap. But to some degree, that's the easy part. Now you must deploy that configuration and ensure it stays enforced, not to mention audited and reported on. With the U.S. Government, having a mandate from the OMB is pretty powerful, which makes this problem space even more critical.

The FDCC is a perfect fit for Virtual Desktops from a deployment and management perspective. Virtual Desktops are all about OS and application standardization and consistency. Think of having a pool of available OS instances, just waiting for a user to log in from a remote device, which could be a hardened thin-client or a legacy PC. All of those OS instances are based on a "Master Image" that has been fully configured with the FDCC policies. When a user logs in, all of their applications are delivered via "Application Virtualization" (e.g. Microsoft App-V or Citrix XenApp), which is still abstracted from the underlying "Master Image", thus keeping the desktop within FDCC standards. All of the user's data and application data is stored on a centralized store (e.g. a SAN), which again keeps the "Master Image" clean of user data and provides additional benefits for the user and IT (e.g. daily backups of all user data).

So what about those users that go on the road? Well, this is where Virtual Desktop is still in play. Using Microsoft MED-V or Citrix XenDesktop, a user can still take their FDCC-approved image and applications on the road with them. The bonus of Virtual Desktop deployments is that the same process and image-based deployments can be done directly on a physical machine as well. You just take that master image, settings, and even application virtualization and deploy it directly on a laptop. Something like Microsoft System Center Configuration Manager plus the Microsoft Deployment Toolkit (a solution accelerator) delivers this type of deployment scenario for both virtual and physical machines.

As with any Virtual Desktop deployment, remember it's not like Server Virtualization! Managing the deployment and operations for a Virtual Desktop Infrastructure (VDI) is extremely different and requires lots of up-front planning. Not to say that Server Virtualization doesn't, but when you consider the number of different users actually logging onto those Virtual Desktops, there are lots of end-user scenarios you have to think through. Even with the guidance of the OMB for FDCC (see, here comes the acronym soup), you may still define additional policies for given user roles, which could include access to applications via a variety of delivery models (e.g. web applications, application virtualization, etc.).

Categories: Technology, Virtualization

Hosted Virtual Machines with XenApp

July 30, 2009

Today on one of Citrix's blogs, they announced a new upcoming technology called "Hosted Virtual Machines" (HVM). As if the virtualization soup of technology wasn't big enough already, this does solve an interesting problem. Without much more information on the subject, here is my take.

Short Version: You want to host an application on a managed VM, but Terminal Services won't work for a number of reasons. With HVM, you use a client OS such as Windows XP to run the application, but the presentation of it (just like Terminal Services) is then sent to the user.

Long Version: It seems that “Virtualization” is getting more and more attached to every new technology, but at the end of the day it’s about access to applications (that includes the OS and other applications).  Let’s put aside the delivery of an OS for now and focus just on the application.  There are a number of ways to provide a user with access to an application.

  1. Traditional – This is where you get a CD or copy files from a file share and install the application locally
  2. Terminal Services – Based on using a single OS instance, such as Windows Server 2008, and allowing multiple users to log on at one time, but each has their own "space"/desktop. The display of that OS, or sometimes just a given application, is presented to the user. Everything runs on the server but is shown to the user on their computer.
  3. Application Virtualization – There are a few flavors of this. The simplest view is about delivery. The application is "preinstalled" and "captured" on a given OS (do a traditional install, but all files, registry settings, shortcuts, etc. are captured) and then deployed to any number of users. So one "install" is then executed on any number of computers. The application will run on the local computer, BUT it's not installed there. No files, registry entries, or shortcuts are anywhere to be found on your computer, yet it still works locally. That's the virtual part. Again, it's all about deployment.

The big issue here is the ability to still provide "Terminal Services"-like delivery of applications, but overcome some of the issues that Terminal Services (TS) has. What kind of TS issues? Well, TS is still a server OS. It doesn't have many of the client components (e.g. those in Windows 7) that some applications require. TS is also multi-user based, and there are some applications that don't work there either.

So why can't Application Virtualization (e.g. Microsoft App-V or Citrix XenApp) work? First off, there are certain applications that are developed by a custom software development shop and built for a given customer and a given OS/application mix. There are other applications certified by an Independent Software Vendor (ISV) that has specific requirements. And then there are organizations like government, health care, and more that need to ensure that certain applications and data behave in a given way. For all of these scenarios, an IT shop may want to provide an application to their users, but refrain from deploying it locally; it won't work via TS; Application Virtualization won't fulfill their requirements; and so on.

So this upcoming solution from Citrix XenApp opens some very interesting doors. I don't think it will see mass adoption, but it will break down certain barriers.

This leads me to think of other solutions such as Microsoft MED-V (Part of MDOP) and MokaFive that provide this kind of host based virtualization, although with HVM Citrix also allows this to be hosted on a server.  I guess I’ll have to wait for a Citrix demo and trial for me to learn more.

BTW, I wonder how this will impact hosters looking to get into the application delivery model. Since this does require another client OS, Citrix rightfully notes that you'll need the Microsoft VECD license. Too bad VECD is not on the Microsoft SPLA list. Bummer.

Categories: Uncategorized

Microsoft VDI Suites Licensing

July 14, 2009

I just read the announcement that Microsoft put out about their new licensing model for Virtual Desktop Infrastructure (VDI):

http://blogs.technet.com/virtualization/archive/2009/07/13/Microsoft_1920_s-new-VDI-licensing_3A00_-VDI-Suites.aspx

If you've ever had to figure out Microsoft licensing for any type of business use, you'll know how complex (and frustrating) it can be. There are plenty of good reasons why it's so complicated. The VDI scenario makes licensing even harder since you're not just talking about end devices anymore (e.g. your laptop); you're also dealing with many virtual components (e.g. virtual applications deployed on a virtual desktop, deployed on a virtual server, deployed on a thin-client).

So based on the new announcement, when it comes to doing an all Microsoft VDI solution, what licensing components do you need to keep in mind?

  1. Microsoft Virtual Desktop Infrastructure Standard Suite (VDIS) – This is the "Platform" license. It covers all of the licenses you need to run a complete Microsoft solution for VDI. It spans the virtual OS (Hyper-V), management (System Center + MDOP), and the server CAL (Remote Desktop). (The Premium Suite includes additional rights for session-based Remote Desktop, formerly named Terminal Services.)
  2. Microsoft Windows Virtual Enterprise Centralized Desktop (VECD) – This is the actual client OS license. It's licensed per device (e.g. the thin-client that you connect from) and allows you to run up to 4 OS instances from that device (which can be spread across any number of servers). Even if you're NOT using Microsoft for your hypervisor and/or management (e.g. you're using VMware or Citrix), you MUST still purchase this license.

So what does this mean from a cost perspective? Both of the licenses above are priced per device (e.g. thin-client or "legacy" PC), per year. (A quick back-of-the-envelope example follows the price list below.)

  1. VDIS Standard: $21.00 (US) per year
  2. VECD for SA: $23.00 (US) per year – This is if your device is a Windows client OS that ALREADY has Software Assurance on it.
  3. VECD: $110.00 (US) per year – This is for a traditional thin-client.
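As promised, here's a quick back-of-the-envelope example using those per-device, per-year prices. The device counts are made up purely for illustration:

    # Hypothetical device counts; prices are the per-device, per-year figures above.
    VDIS_STANDARD = 21.00   # VDI Standard Suite, USD per device per year
    VECD_FOR_SA   = 23.00   # VECD for SA, USD per device per year
    VECD          = 110.00  # VECD (traditional thin-client), USD per device per year

    thin_clients   = 100    # devices without a Windows client OS under SA
    sa_covered_pcs = 250    # PCs already covered by Software Assurance

    annual_cost = (thin_clients   * (VDIS_STANDARD + VECD) +
                   sa_covered_pcs * (VDIS_STANDARD + VECD_FOR_SA))
    print(f"Estimated annual licensing: ${annual_cost:,.2f}")
    # 100 * (21 + 110) + 250 * (21 + 23) = 13,100 + 11,000 = 24,100 USD per year

In other words, the per-device, per-year model lets you estimate the whole bill from a simple device count instead of trying to count virtual OS instances.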

So why does Microsoft do a VECD license in the first place? If you look at the license of the Windows client OS (like I'm sure we all do), you'll notice that the license is perpetual. So wherever you install it, it must stay there. Not only that, you need an OS license for each OS instance you use. With VECD, I don't have that same headache. The IT department can deploy any number of combinations of Windows XP, Windows Vista, and Windows 7 for specific roles, tasks, training, or whatever. It doesn't need to track the total number of virtual OS instances for licensing, as the OS license is tracked by the number of end devices using a given image. Now this doesn't mean that an IT department will deploy thousands of images (what a headache), as there are better ways to use "golden" images and to dynamically deploy new Virtual Machines to a "pool" of available clients (future post!!). But this does free up the IT department to provide OS instances and applications on demand, because VECD covers them to do so! Again, this is a MUST for ALL VDI DEPLOYMENTS no matter which vendor you use for Virtual Desktop.

I think the license change from Microsoft will make it MUCH easier for customers to budget for Virtual Desktop using the Microsoft platform.

Categories: Uncategorized