Archive
#Gartner report – How to Choose Between #Hyper-V and #vSphere – #IaaS
The constant battle between the hypervisor and the orchestration of IaaS etc. is of course continuing! But I must say it is really fun to see Microsoft getting more and more mature with its offerings in this space. Great job!
One of the things I tend to think most about is the cost, scalability and flexibility of the infrastructure we build, and how we build it. I often see that we tend to do what we've done for so many years now: we buy our SAN/NAS storage, we buy our servers (leaning towards blade servers because we think that's the latest and coolest), and then we try to squeeze that into some sort of POD/FlexPod/UCS or whatever we like to call it, to find the optimal "volume of compute, network and storage" that we can scale. But is this scalable in the way the bigger cloud players like Google, Amazon etc. scale? Is this 2013 state of the art? I think we're just fooling ourselves a bit, building whatever we've built for all these years without really providing the business with anything new… but that's my view. I know what I'd look at, and most of you who have read my earlier blog posts know that I love scaling out and doing it more like the big players, using something like Nutanix, and ensuring that you choose the right IaaS components as part of that stack, as well as the orchestration layer (OpenStack, System Center, CloudStack, Cloud Platform or whatever you prefer after you've done your homework).
Back to the topic a bit: I'd say that the hypervisor is of no importance anymore; that's why everyone is giving it away for free or to the open source community! Vendors are after the IaaS/PaaS orchestration layer instead, because if they win that business they have nested their way into your business processes. That's where the value ultimately gets delivered, as IT services in an automated way once you've got your business services and processes in place. And at that point it's harder to make a change, and they will live fat and happy on you for some years to come! 😉
#Windows #Azure Desktop Hosting Deployment Guide – #RDS, #BYOD – via @michael_keen
This is great! Have a look at this guide!
Hello everyone, this is Clark Nicholson from the Remote Desktop Virtualization Team. I’m writing today to let you know that we have just published the Windows Azure Desktop Hosting Deployment Guide. This document provides guidance for deploying a basic desktop hosting solution based on the Windows Azure Desktop Hosting Reference Architecture Guide. This document is intended to provide a starting point for implementing a Desktop Hosting service on Windows Azure virtual machines. A production environment will need additional deployment steps to provide advanced features such as high availability, customized desktop experience, RemoteApp collections, etc.
For more information, please see Remote Desktop Services and Windows Azure Infrastructure Services.
Continue reading here!
//Richard
#Microsoft Desktop Hosting Reference Architecture Guides
Wow, these are some compelling guides that Microsoft delivered!! Have a look at them! But of course there's always something more you want: let service providers provide DaaS services based on client OSs as well!!!
Microsoft has released two papers related to Desktop Hosting. The first is called: “Desktop Hosting Reference Architecture Guide” and the second is called: “Windows Azure Desktop Hosting Reference Architecture Guide“. Both documents provide a blueprint for creating secure, scalable, multi-tenant desktop hosting solutions using Windows Server 2012 and System Center 2012 SP1 Virtual Machine Manager or using Windows Azure Infrastructure Services.
The documents are targeted to hosting providers which deliver desktop hosting via the Microsoft Service Provider Licensing Agreement (SPLA). Desktop hosting in this case is based on Windows Server with the Windows Desktop Experience feature enabled, and not Microsoft’s client Operating Systems like Windows 7 or Windows 8.
For some reason, Microsoft still doesn’t want service providers to provide Desktops as a Service (DaaS) running on top of a Microsoft Client OS, as outlined in the “Decoding Microsoft’s VDI Licensing Arcanum” paper which virtualization.info covered in September this year.
The Desktop Hosting Reference Architecture Guide provides the following sections:
- Desktop Hosting Service Logical Architecture
- Service Layer
- Tenant Environment
- Provider Management and Perimeter Environments
- Virtualization Layer
- Hyper-V and Virtual Machine Manager
- Scale-Out File Server
- Physical Layer
- Servers
- Network
- Tenant On-Premises Components
- Clients
- Active Directory Domain Services
The Windows Azure Desktop Hosting Reference Architecture covers the following topics:
#Microsoft launches its #Azure #Hadoop service! – via @maryjofoley
This is really cool!
Microsoft’s cloud-based distribution of Hadoop — which it has been developing for the past year-plus with Hortonworks — is generally available as of October 28.
Microsoft officials also are acknowledging publicly that Microsoft has dropped plans to deliver a Microsoft/Hortonworks-developed implementation of Hadoop for Windows Server, which was known as HDInsight Server for Windows. Instead, Microsoft will be advising customers who want Hadoop on Windows Server to go with Hortonworks Data Platform (HDP) for Windows.
Windows Azure HDInsight is “100 percent Apache Hadoop” and builds on top of HDP. HDInsight includes full compatibility with Apache Hadoop, as well as integration with Microsoft’s own business-intelligence tools, such as Excel, SQL Server and PowerBI.
“Our vision is how do we bring big data to a billion people,” said Eron Kelly, Microsoft’s SQL Server General Manager. “We want to make the data and insights accessible to everyone.”
Making the Hadoop big-data framework available in the cloud, so that users can spin up and spin down Hadoop clusters when needed is one way Microsoft intends to meet this goal, Kelly said.
Microsoft and Hortonworks originally announced plans to bring the Hadoop big-data framework to Windows Server and Windows Azure in the fall of 2011. Microsoft made a first public preview of its Hadoop on Windows Server product (known officially as HDInsight Server for Windows) available in October 2012.
Microsoft made available its first public preview of its Hadoop on Windows Azure service, known as HDInsight Service, on March 18. Before that…
Continue reading here!
//Richard
#Microsoft to acquire #Nokia’s devices & services business
This is interesting, but I must admit that I'm not that surprised…
Microsoft to acquire Nokia’s devices & services business, license Nokia’s patents and mapping services
REDMOND, Washington and ESPOO, Finland – Sept. 3, 2013 – Microsoft Corporation and Nokia Corporation today announced that the Boards of Directors for both companies have decided to enter into a transaction whereby Microsoft will purchase substantially all of Nokia’s Devices & Services business, license Nokia’s patents, and license and use Nokia’s mapping services.
Under the terms of the agreement, Microsoft will pay EUR 3.79 billion to purchase substantially all of Nokia’s Devices & Services business, and EUR 1.65 billion to license Nokia’s patents, for a total transaction price of EUR 5.44 billion in cash. Microsoft will draw upon its overseas cash resources to fund the transaction. The transaction is expected to close in the first quarter of 2014, subject to approval by Nokia’s shareholders, regulatory approvals and other closing conditions.
Building on the partnership with Nokia announced in February 2011 and the increasing success of Nokia’s Lumia smartphones, Microsoft aims to accelerate the growth of its share and profit in mobile devices through faster innovation, increased synergies, and unified branding and marketing. For Nokia, this transaction is expected to be significantly accretive to earnings, strengthen its financial position, and provide a solid basis for future investment in its continuing businesses. Read more…
Are #Microsoft Losing Friends and Alienating IT Pros? – via @andyjmorgan, @stevegoodman
This is a great blog post by Steve Goodman!
Regular readers of my blog will know I'm a big fan of Microsoft products. As well as being an Exchange MVP, I'm very much a cloud fan – you'll find me at Exchange Connections in a few weeks' time talking about migrating to Exchange Online, amongst other subjects. What I'm about to write doesn't change any of that, and I hope the right people will read this and have a serious re-think.
Microsoft’s “Devices and Services” strategy is leaving many in the industry very confused at the moment.
If you’ve been living under a rock – I’ll give you an overview. They’ve dropped MCSM, the leading certification for their Server products. They’ve dropped TechNet subscriptions, the benchmark for how a vendor lets its IT pros evaluate and learn about their range of products. And they’ve been very lax with the quality of updates for their on-premises range of products, Exchange included, whilst at the same time releasing features only in their cloud products.
A range of MCMs and MCSMs – Microsoft employees included – have been expressing their opinions here, here, here, here and in numerous other places. We've discussed the TechNet subscriptions on The UC Architects' podcast.
One thing is key – this kind of behaviour absolutely destroys trust in Microsoft. After the last round of anti-trust issues, it took a long time for Microsoft to gain a position of trust along with many years of incrementally releasing better and better products. A few years ago Microsoft was just about “good enough” to let into your datacentre; now it’s beginning to lead the way, especially with Hyper-V, Exchange and Lync.
Before I get started on Microsoft’s cloud strategy, let’s take a jovial look at what (from my experience) is Google’s strategy:
- Tell the customer their internal IT sucks (tactfully), ideally without IT present so they can talk about the brilliance of being “all in” the cloud without a dose of reality getting in the way.
- Class all line of business apps as irrelevant – the sales person was probably still in nursery when they were deployed. Because those apps are old, they must be shit.
- Show a picture of something old and irrelevant – like a mill generating its own energy. Tell them that's what their IT is! You, the customer, don't run a power station, so why would you run your own IT? If you do run your own IT you are irrelevant and getting left behind.
- Make out the customer’s own IT is actually less reliable than it is. Don’t mention that recent on-premises products cost less, are easy for the right people to implement and from a user perspective are often more reliable than an overseas cloud service.
- Only provide your products in the cloud so once you’re in… you’re in.
- Don’t let anyone from the outside be a real expert on the technology. You don’t need a Google “MVP”, because 99% of Google server products can only be provided by one company.
- Once you’ve signed up a customer, remember: you don’t need to give them good support. They can’t go anywhere without spending money on a third-party solution to get their data out.
From a Microsoft MVP point of view, Google’s strategy is brilliant: although we like a lot of their products, it drives customers away in droves. Microsoft’s traditional approach to the cloud – and its partner ecosystem – would be a breath of fresh air to someone who’s been through the Google machine.
Unfortunately, based on recent experiences of my own and of others, the above is actually looking pretty similar to Microsoft’s new strategy….
Continue reading here!
//Richard
#Gartner Magic Quadrant for Cloud Infrastructure as a Service – #IaaS
Market Definition/Description
Cloud computing is a style of computing in which scalable and elastic IT-enabled capabilities are delivered as a service using Internet technologies. Cloud infrastructure as a service (IaaS) is a type of cloud computing service; it parallels the infrastructure and data center initiatives of IT. Cloud compute IaaS constitutes the largest segment of this market (the broader IaaS market also includes cloud storage and cloud printing). Only cloud compute IaaS is evaluated in this Magic Quadrant; it does not cover cloud storage providers, platform as a service (PaaS) providers, software as a service (SaaS) providers, cloud services brokerages or any other type of cloud service provider, nor does it cover the hardware and software vendors that may be used to build cloud infrastructure. Furthermore, this Magic Quadrant is not an evaluation of the broad, generalized cloud computing strategies of the companies profiled.
In the context of this Magic Quadrant, cloud compute IaaS (hereafter referred to simply as “cloud IaaS” or “IaaS”) is defined as a standardized, highly automated offering, where compute resources, complemented by storage and networking capabilities, are owned by a service provider and offered to the customer on demand. The resources are scalable and elastic in near-real-time, and metered by use. Self-service interfaces are exposed directly to the customer, including a Web-based UI and, optionally, an API. The resources may be single-tenant or multitenant, and hosted by the service provider or on-premises in the customer’s data center.
We draw a distinction between cloud infrastructure as a service, and cloud infrastructure as a technology platform; we call the latter cloud-enabled system infrastructure (CESI). In cloud IaaS, the capabilities of a CESI are directly exposed to the customer through self-service. However, other services, including noncloud services, may be delivered on top of a CESI; these cloud-enabled services may include forms of managed hosting, data center outsourcing and other IT outsourcing services. In this Magic Quadrant, we evaluate only cloud IaaS offerings; we do not evaluate cloud-enabled services. (See “Technology Overview for Cloud-Enabled System Infrastructure” and “Don’t Be Fooled by Offerings Falsely Masquerading as Cloud Infrastructure as a Service” for more on this distinction.)
This Magic Quadrant covers all the common use cases for cloud IaaS, including development and testing, production environments (including those supporting mission-critical workloads) for both internal and customer-facing applications, batch computing (including high-performance computing [HPC]) and disaster recovery. It encompasses both single-application workloads and “virtual data centers” (VDCs) hosting many diverse workloads. It includes suitability for a wide range of application design patterns, including both “cloud-native”….
Figure 1. Magic Quadrant for Cloud Infrastructure as a Service
Source: Gartner (August 2013)
Continue reading here!
//Richard
Today is the RTM for #Windows Server 2012 R2! – #Microsoft
Microsoft blog post about the RTM release of Windows Server 2012 R2:
As noted in my earlier post about the availability dates for the 2012 R2 wave, we are counting the days until our partners and customers can start using these products. Today I am proud to announce a big milestone: Windows Server 2012 R2 has been released to manufacturing!
This means that we are handing the software over to our hardware partners for them to complete their final system validations; this is the final step before putting the next generation of Windows Server in your hands.
While every release milestone provides ample reason to celebrate (and trust me, there’s going to be a party here in Redmond), we are all particularly excited this time around because we’ve delivered so much in such a short amount of time. The amazing new features in this release cover virtualization, storage, networking, management, access, information protection, and much more.
By any measure, this is a lot more than just one year’s worth of innovation since the release of Windows Server 2012!
As many readers have noticed, this release is being handled a bit differently than in years past. With previous releases, shortly after the RTM Microsoft provided access to software through our MSDN and TechNet subscriptions. Because this release was built and delivered at a much faster pace than past products, and because we want to ensure that you get the very highest quality product, we made the decision to complete the final validation phases prior to distributing the release. It is enormously important to all of us here that you have the best possible experience using R2 to build your private and hybrid cloud infrastructure.
We are all incredibly proud of this release and, on behalf of the Windows Server engineering team, we are honored to share this release with you. The opportunity to deliver such a wide range of powerful, interoperable R2 products is a powerful example of the Common Engineering Criteria that I’ve written about before.
Also of note: The next update to Windows Intune will be available at the time of GA, and we are also on track to deliver System Center 2012 R2.
Thank you to everyone who provided feedback during….
Continue reading here!
//Richard
Microsoft is progressing quickly! – SkyDrive Pro updated to 25GB and improved sharing – via @BasvanKaam
I must say this once again: Microsoft looks to be on the right track when it comes to coming back as one strong supplier of services in the future/present “BYOD” world. As I wrote in my post #Microsoft – On the right track! – #Windows, #BYOD, #Citrix, Microsoft is now actually targeting many of the gaps we see with today’s services for BYOx scenarios. For instance, how to manage just what you want on top of the device (Azure, Intune, SkyDrive, Work Folders etc.) in a controllable fashion, rather than a fully managed device that costs you a fortune to manage… and ShareFile, Box and others are great solutions that have many features SkyDrive doesn’t have. But there is one thing that they all lack (or please enlighten me!!):
Encryption at rest on Windows, OS X and Linux OSs/distributions: here all providers lean on you already having hard drive encryption like BitLocker etc. But who manages that then? Can you then say that your service is “BYOD-compliant”? I wouldn’t say so… It’s not only smartphones and tablet devices that we lose… but here Microsoft and SkyDrive may be the first to come with encryption on at least Windows 8.1 devices, and in a somewhat manageable way…
But again back to the announcement from Microsoft and SkyDrive:
Microsoft announced today that it is giving business users more storage space and a better way to share files across multiple devices. As first reported by TechCrunch, through its SkyDrive Pro accounts, employees will now receive 25GB of storage to start out with, a sharp increase from 7GB — and even this capacity can be increased to 50GB or even 100GB. Additionally, using SkyDrive’s Shared with Me view, users can share files with their friends and co-workers securely and in real-time.
According to Microsoft Senior Product Managers Mark Kashman and Tejas Mehta, the new storage space limits will be available for both new and existing customers.
This certainly makes the service stand out among its competitors, namely Dropbox and Box. It was only about a week or so ago that the latter heralded the launch of a new pricing plan aimed at increasing the number of small businesses using its service. For personal users, Box also wound up doubling the amount of free storage they received.
Here’s how you can figure out the overall storage for each user:
With Office 365, you get 25 GB of SkyDrive Pro storage + 25 GB of email storage + 5 GB for each site mailbox you create + your total available tenant storage, which for every Office 365 business customer starts at 10 GB + (500 MB × number of users).
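As a hypothetical illustration of the quoted formula (the function name and structure are my own; the GB figures come straight from the announcement above), the per-tenant arithmetic might be sketched like this:

```python
def office365_storage_gb(users, site_mailboxes=0):
    """Rough total storage (GB) per the quoted SkyDrive Pro announcement."""
    skydrive_pro = 25 * users            # 25 GB SkyDrive Pro per user
    email = 25 * users                   # 25 GB email storage per user
    site_mb = 5 * site_mailboxes         # 5 GB per site mailbox created
    tenant_pool = 10 + 0.5 * users       # 10 GB base + 500 MB per user
    return skydrive_pro + email + site_mb + tenant_pool

# e.g. a 10-user tenant with 2 site mailboxes:
print(office365_storage_gb(10, 2))  # 250 + 250 + 10 + 15 = 525.0 GB
```

This is just back-of-the-envelope math for sizing; the actual quotas are enforced per service, not as one shared pool.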
While Dropbox, Box, and Hightail certainly are some of the most popular services out there today, SkyDrive isn’t something to be trifled with either. Through its integration with the Surface, Windows Phone, and other Microsoft products, along with iOS and Android devices, it has the potential to be a very powerful service.
As for the new sharing feature, just like you would perhaps see in Google Drive or any other cloud storage service, SkyDrive Pro is now offering a Shared with Me view that lets you take a shared document and view, edit, re-share, download, and more — all as if it were in your own storage bin.
But Microsoft isn’t stopping there, as it is adding several minor, but interesting enhancements to SkyDrive. The company has also increased the overall file upload limit to its SharePoint Online service to 2GB per file. Files placed into the recycle bin will now remain…
Continue reading here!
//Richard