Monday, 3 August 2015

Microsoft Windows 10 Review: The Best Operating System From Microsoft Ever!!!

New look, new apps, new browser. Here's what we think of Microsoft's new OS...

I have been testing Windows 10 for two months on my desktop and laptop, and it is the best PC experience I have had since Windows XP. The Start Menu is back, better than ever, and even though it is familiar, it also feels fresh!
I like the direction Microsoft is taking with Windows 10, accepting feedback and ideas from its customers and professionals along the way. It feels like the best way to shape Windows into something people love to use, not something they are forced to use.
Windows 10 delivers a refined, vastly improved vision for the future of computing with an operating system that's equally at home on tablets and traditional PCs.
This new operating system combines the best of old and new Windows features, making it the best OS from Microsoft to date while correcting the mistakes made in Windows 8.
What's new: Windows 10 fixes a lot of the irritations found in Windows 8 by improving the interface for desktop computers and laptops, including the return of the Start menu, a virtual desktop system, and the ability to run Windows Store apps within a window on the desktop rather than in full-screen mode.
Windows 10 is a universal platform that runs across devices such as Windows phones, Surface tablets, servers, data centers, and game consoles.


Adjustable Start Menu: The Best Start Menu to Date: The Start menu defaults to a narrow column, but users can drag its margins to their liking. Fans of "Live Tiles," the icons that quick-launch apps, may want a broader canvas.



Spoken Reminders: Hit the mic icon in the search bar, and the digital assistant Cortana will listen for spoken commands. Cortana can then feed the relevant information directly into the calendar, email, reminder, and calculator apps. Try saying "Remind me to buy a new watch tomorrow at 6 pm," and you'll get a sense of the possibilities.

"Hey Cortana:" Really chatty users can go into Cortana's settings and flip on "Hey Cortana." The digital assistant will then wake up at that very same voice command.
Notebook: Cortana follows your search and browsing habits in an attempt to decipher your personal tastes. Cut to the chase by hitting the notebook icon in Cortana's settings and filling out your preferences directly. More privacy-minded users can also cut off Cortana's senses by hitting "Manage what Cortana knows about me in the cloud."
Refined Searches: The search bar embedded in the Start screen simultaneously searches your personal files and the web. For a tighter focus, you'll notice two buttons appear as you type a search term: one offers to search "My stuff," the other the "Web." Select according to your needs; "My stuff" shows the files and folders on your PC.
Forget-Me-Not Files: Can't remember the name of that PowerPoint deck? Enter the file type ".ppt" in the search bar, and it will pull up every saved PowerPoint file, sortable by relevance or recency. Ditto for Word docs and Excel spreadsheets.
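For the curious, the idea boils down to "find files by extension, sort by recency." Here is a rough Python sketch of that idea; the starting folder and extensions are illustrative assumptions only, not anything Windows itself uses.

```python
from pathlib import Path

# Hypothetical sketch: find PowerPoint files under the user's home folder
# and list the ten most recently modified ones, newest first.
home = Path.home()
decks = [p for ext in (".ppt", ".pptx") for p in home.rglob(f"*{ext}")]
decks.sort(key=lambda p: p.stat().st_mtime, reverse=True)

for deck in decks[:10]:
    print(deck)
```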
Everything Runs in a Window: Apps from the Windows Store now open in the same format that desktop apps do; they can be resized and moved around, and they have title bars at the top that allow you to maximize, minimize, and close them with a click.

Snap Enhancements: You can now have four apps snapped on the same screen with a new quadrant layout.
Windows will also show other apps and programs running for additional snapping and even make smart suggestions on filling available screen space with other open apps.
New Task View Button: There’s a new task-view button on the taskbar for quick switching between open files and quick access to any desktops you create.

Multiple Desktops: Create desktops for different purposes and projects, switch between them easily, and pick up where you left off on each desktop. This helps me a lot when working on multiple projects at a time.

Find Files Faster: File Explorer now displays your recent files and frequently visited folders, making it easier to find files you've worked on. This is a really helpful feature for professionals and students working on multiple projects.

Continuum: For convertible devices, such as the Surface, there are two modes, tablet and desktop. When using the device as a tablet, Windows 10 will automatically change to tablet mode which is more touch-friendly. This means apps will run full screen and allow you to use touch gestures to navigate.
Once you connect a mouse and keyboard, or flip your laptop around, Windows will go into desktop mode. Apps turn back into desktop windows that are easier to move around with a mouse and you’ll see your desktop again.
Microsoft Edge Web Browser: I don't remember the last time I used Internet Explorer, thanks to the all-new Edge browser. Codenamed "Spartan," Microsoft Edge is Microsoft's new web browser, replacing Internet Explorer. It is much more lightweight than its predecessor and lets you annotate webpages and share those annotations with others. It is fast and reliable, with loads of useful features that make Edge the best web browser available today.

Reading List: The star icon in Microsoft Edge doesn't just add a webpage to your favorites list. You'll notice a second option to save a story to a "Reading List." The browser then automatically saves the headline, the picture, and the link inside a handy side menu, which slides out of view until you're ready for some heavy-duty reading. It is especially helpful for users with limited internet access.

Marginalia: Microsoft Edge includes a pen and notepad icon in the upper left hand corner. Hit it, and Edge will convert the webpage into mark-up mode. Use digital ink, highlighters and text boxes to mark up the page. Use the share icon to email or save your web clippings.


New Calendar App: The new calendar app lets you link in your Outlook calendar and your email address, and it now runs in a window rather than full screen.

New Mail App: The mail app has had a facelift and lets you link in your Outlook email or your Microsoft account email address.


Photos App: The photos app is a nice little way to organize your photos and works whether you are on a tablet, phone or desktop PC. It will import photos directly from your digital camera or on-board camera if you are using a tablet or phone.

We can also perform minor corrections and enhancements, such as removing red-eye, lightening a dark photograph, and applying simple effects such as sepia or black and white. To do this, tap or click on an image; this opens the image in view mode.

As you move your mouse or tap on the image, a toolbar appears along the top. It gives you options to share the photograph via email or social media and to view the image full screen. You can tap the magic wand icon to perform automatic adjustments such as brightness and contrast, or tap the pencil icon to do your own editing and photo enhancements. These features were much awaited in Microsoft's default Photos app.



Verdict: Windows 10 works well and didn't break any of my older Windows software. The launch is just the start: Microsoft intends to continuously upgrade it over time, and the user has no choice in the matter, since you can't turn updates off without becoming unsupported. There is a Microsoft tool to hide or block unwanted driver updates, however. The great news is that you will not need any additional device drivers if you already have drivers for a previous version of Windows.







Friday, 29 August 2014

What IT Leadership Can Learn From Manufacturing.

So many IT leaders realize their world is becoming a different place, and fast.  You can see it in their faces, hear it in the tone of their voices — almost feel the anxiety.

Like most leaders, they often go looking for examples from others who are adjusting well to their new realities. 
While there is plenty to learn from their peers, I usually counsel that understanding how modern manufacturing has changed (and continues to change!) provides ample lessons and tools about how to think about the modern IT organization.

One thing's for sure: there's no going back to Kansas anytime soon …
A Wealth Of Parallels
At a fundamental level, manufacturing is about creating value-add around physical goods.  One could make an argument that IT (and computing in general) is about creating value-add around information.

Both manufacturing and IT face somewhat similar constraints: the cost of capital, labor, limits in technology, unpredictable demand, long supply chains, and much more.

Both find themselves aggressively competing for their customers. 
Both are continually figuring out their unique value-add: what things do we do for ourselves, and what things do we leave to others to do more efficiently? 
Both have to continually re-invent their model, otherwise risk falling behind.

For those of you who work at companies with a strong manufacturing component, there’s a wealth of experience and perspective waiting to be tapped by the IT team.  For the rest of you, there is plenty of material readily available on how modern manufacturing is practiced.

I’d encourage you to invest the time.

A Brief History?

Thanks to Wikipedia, it’s not hard to get a sense of how manufacturing evolved.  It started with individual artisans — craftspeople — and then evolved into highly structured guilds.

Remember that "guild" concept the next time you interact with your database, network or security team :)

The advent of better power sources and transportation changed manufacturing from a local industry to a global one where scale mattered.  Human hands gave way to increasing levels of automation.  The traditional guilds were replaced by new models and new skills.
All somewhat reminiscent of what the microprocessor, the internet and the "cloud" are doing to enterprise IT.

Over the last few decades, the pendulum in some manufacturing sectors appears to have swung from mass efficiencies to mass customization: valuing flexibility, agility and responsiveness over ultimate efficiency.

If you're curious, check out this short piece on Reconfigurable Manufacturing Systems, circa 1999.  The idea is simple: physical manufacturing assets should be under software control, and completely reconfigurable based on changing demands.

This should sound vaguely familiar to many of you …

No discussion would be complete without acknowledging the advent of 3D printing — transforming yet another labor and capital intensive component of manufacturing into something that is entirely under software control.

One could justifiably say that — when it comes to modern manufacturing — it’s quickly becoming all about the know-how that’s implemented in software.

Back To IT

Recently, I was reading an analyst's survey finding that SDDC (software-defined data center) concepts had been strongly adopted by about a third of the participants.  The remainder either weren't quite sure or saw themselves going in a different direction.  I'm not exactly sure what that different direction might be …

As a VMware employee, you might think I would see the findings as potentially negative news.  Quite the opposite, I was gratified to see that a third of the participating senior IT leaders understood SDDC concepts and saw themselves moving in that direction.

To be fair, the concepts have only been around for a relatively short period, and the supporting technologies (beyond compute, that is) are now just entering the marketplace. 
Combine that reality with the unavoidable fact that the entire IT (manufacturing?) organization has to be re-envisioned around how information services are sourced, produced and consumed in an SDDC model — and I'm impressed.

A Bigger Picture

I've often argued that our society is quickly evolving to an information economy.  All businesses will be information businesses before long — if they're not today.

Just as manufacturing played a central role in previous business models (and still does today), information and the supporting IT functions will continue to increase in prominence.

These IT factories will need new technology blueprints to be efficient, agile and responsive.  That’s what I see in SDDC — and I guess I’m not alone.

And there is plenty to be learned from how it’s done in the physical world.
 

Lose Your Data: Lose Your Business.

Another unpleasant aspect of our new “information economy”.

A promising young start-up (Code Spaces) was held up for ransom by an intruder who broke into their AWS account and took control.  The digital kidnapper wanted a payoff, or else …
A sad posting says that — basically — all their customers' data is gone, and they're done for.  That's it.  There's no coming back for them.  Not to mention the pain inflicted on their trusting customers.
In a not entirely-unrelated story, in the US, the IRS (the tax agency) is in serious hot water because they can’t produce emails in the context of a congressional investigation.  The excuse?  The emails were on a personal hard drive (??), which failed, and has long been disposed of.
While the IRS is not out of business (after all, they’re a government agency), they’re certainly seriously impacted by the incident, making doing business more difficult.  No, I’m not going to try and claim the same with my personal tax records …
And with every tragedy, there are lessons to learn.
Do You Have A REAL Backup?
IT professionals know that a REAL backup is one that's completely separate and isolated from the original data source in as many ways as possible: separated logically, separated physically, stored on different media technology, protected by different access credentials, etc.

The more kinds of separation, the better the protection.
I have taken ridicule for this position before (e.g. people who consider a simple snapshot a backup), but I'll stand my ground.  All those snapshots aren't doing Code Spaces much good now, are they?
If losing data permanently and irretrievably would be an unmitigated disaster, then extra precautions are needed.
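To make that concrete, here is a minimal sketch, in Python, of the "copy it somewhere genuinely separate, then verify it" idea. The paths, and the assumption that the target lives on independently credentialed storage, are purely illustrative.

```python
import hashlib
import shutil
from pathlib import Path

# Hypothetical paths: a primary data directory and a separately managed
# backup target (different machine, different media, different credentials).
SOURCE = Path("/data/primary")
TARGET = Path("/mnt/offsite-backup")

def sha256(path: Path) -> str:
    """Checksum a file so the copy can be verified independently."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_and_verify() -> None:
    for src in SOURCE.rglob("*"):
        if not src.is_file():
            continue
        dst = TARGET / src.relative_to(SOURCE)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)
        # A copy only counts as a backup once it has been verified.
        if sha256(src) != sha256(dst):
            raise RuntimeError(f"verification failed for {src}")

if __name__ == "__main__":
    backup_and_verify()
```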
What’s Changing
What’s become popular recently is a new breed of “digital kidnapper” — someone who extorts ransom to avoid the loss of your data.

We all know (or should know) about the recent spate of malware that encrypts your personal hard drive.  If you derive your livelihood from your personal computer (as many of us do), this can be a life-altering experience.
If you didn’t have religion about real backups before, you’ll certainly have it now.
The Cloud Angle
Code Spaces appears to have run entirely on Amazon’s AWS — primary data, backups, etc.  In my book, that’s dangerous — if AWS has a bad day, you have an even worse day.  And everyone has a bad day, sooner or later.
All access was through their control panel. The bad guy got access, and he was in business. Not being deeply familiar with AWS, I’m now very curious about how access control is set up for AWS’ control panel.

An awful lot of valuable data is stored there — think of it as a huge bank — and one now has to ask questions to see if it could happen again, and what steps would be necessary to prevent that.
A related question: was there anyone at AWS they could have contacted to help out?  Amazon’s model is highly automated; when a customer has a crisis of this magnitude, I would guess they’re not set up to respond quickly, if at all.   The service did what it was designed to do.
In hindsight, if Code Spaces had been making simple lazy copies to anything else — a home computer, a server elsewhere, etc. — the effects of the attack could have been somewhat mitigated.  They'd have been back in business, after a stretch.
That’s the value of a real backup: when something bad happens, you’re injured, but you’re not dead.
Shifting Tides
Not all that long ago, most business processes ran on a combination of paper and digital.  If the computer lost data, you could always go back to paper records, and attempt to recreate things.

Not anymore.  There’s no paper trail.  Lose the data, it’s gone.  Although, in the case of the IRS, I bet those emails are somewhere :)
Information is the new wealth, the new repository of value.  That’s going to attract bad guys — if not for IP theft, then for ransom attempts.  
Just like you can get your bank account cleaned out, you can get your cloud account cleaned out — with similar disastrous impacts.
This is not a criticism of clouds, or AWS, or anything else — just that the world has changed, and we must think and act differently to protect our information.

Policy-Based IT: The Next IT Frontier.

Several years ago, it became clear to me that the next aspirational model for enterprise IT was “IT as a Service”, or ITaaS.    

At its core was a simple yet powerful idea: that the core IT operational model should be refashioned around the convenient consumption of IT services.  
Under the ITaaS model, most everything IT does is now presented as a variable service, marketed to users, with supply driven by the resulting demand. 
IT becomes the internal service provider of choice.
Now, several years later, that once-controversial idea has clearly grown deep roots, with many examples of progressive IT organizations embracing this perspective.   Some have made the transition, some are mid-journey, others have yet to begin.  The IT world has moved forward.
So, it’s fair to ask — what might come next?  I have a strong suspicion as to what the next operational model will be.

When it comes to continually improving IT productivity, automation is the lever.  It's the gift that keeps on giving when it comes to IT outcomes.  Progressively improved automation means progressively improved capex and opex efficiency, fewer errors, more responsive reactions — done right, everything gets better and better.
It's not just an IT thing: you'll see the same continuing automation investment patterns in manufacturing, logistics, consumer marketing — any endeavor where core processes are important.
Broadly speaking, there are two approaches to how one goes about automating IT.  Many think bottom-up: take individual, domain-specific repetitive tasks, and automate them — perhaps in the form of a script, or similar.

The results are incremental, not transformational.
During the early days of telephony, switchboard operator productivity was limited by the reach of the operator’s arms.  Someone came up with the idea of putting wheels on the chairs.   Clever, but only modest productivity gains resulted — what was needed was a re-thinking of the problem at hand.
We’ve got the same situation in IT automation: we’re not after mere incremental improvements, what we really want is a sequence of order-of-magnitude improvements.  And to do that, we need to think top-down vs. bottom-up. 
Starting At The Top
Since IT is all about application delivery, applications logically become the top of the stack.  Approached that way, automation becomes about meeting the needs of the application, expressed in a manifest that we refer to here as “policy”.   
We need to be specific here, as the notion of “policy” is so broad it can conceivably be applied almost anywhere in the IT stack, e.g. what rebuild approach do you want to use for this specific disk drive? 

Indeed, listen to most IT vendors and you’ll hear the word “policy” used liberally.   To be clear, policies can nest — with higher-level policies invoking lower-level ones.
For this conversation, however, we’re specifically referring to top-level policies associated with groups of applications.  
The Big Ideas Behind (Application) Policy
The core idea behind “policy” is simple: policies express desired outcomes, and not detailed specifications for achieving that outcome.  Policies are a powerful abstraction that has the potential to dramatically simplify many aspects of IT operations.
Speaking broadly, application policies could address three scenarios: normal day-to-day operations, constrained operations (e.g. insufficient resources), and special events (e.g. an outage, software updates, maintenance windows, etc.) 

In addition to being a convenient shorthand for expressing requirements, policies are also an effective construct for managing change.  When application requirements shift — as they often do — a new policy is applied, which results in the required changes being cascaded through the IT infrastructure.  Or the policy model can be used to preview what a proposed change might do.
Finally, compliance checking — at a high level — becomes conceptually simple.  From controlling updates to monitoring service delivery: here is what the policy specifies — is it being done?  And if not, what is needed to bring things into compliance?   
You end up with a nice, unambiguous closed-loop system.
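As a sketch only, with entirely made-up field names, the closed loop can be as simple as comparing what the policy declares against what is actually observed and reporting the drift:

```python
# Minimal sketch of the closed loop described above, with invented fields:
# the policy states the desired outcome, the check reports any drift.
policy = {
    "tier": "business-support",
    "vcpus": 4,
    "memory_gb": 16,
    "replicas": 2,
    "encryption": True,
}

observed = {
    "vcpus": 4,
    "memory_gb": 8,      # drifted away from what the policy asks for
    "replicas": 2,
    "encryption": True,
}

def compliance_gaps(policy: dict, observed: dict) -> dict:
    """Return the policy fields whose observed values differ from the spec."""
    return {
        key: (observed.get(key), wanted)
        for key, wanted in policy.items()
        if key in observed and observed[key] != wanted
    }

gaps = compliance_gaps(policy, observed)
if gaps:
    print("out of compliance:", gaps)   # remediation would be driven from here
else:
    print("in compliance")
```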
Stepping outside of IT for a moment, we’ve all probably had personal experience in organizational policies being handed down from above: travel policies, hiring policies, etc.  Not a new idea.

The ones that work well seem to be the ones that outline broad objectives and provide guidelines or suggestions.  The ones that seem to cause friction are the ones that are overly specific and detailed, and hence constraining.
Simple Example #1
Let’s take an ordinary provisioning example of an application.  At one level, you can think of a policy as a laundry list of resources and services required: this much compute, memory, storage, bandwidth, these data protection services, this much security, etc.   
So far, so good.  Our notion of policy is focused more on what’s needed, rather than how it’s actually done.   But, since we’re presumably working against a shared pool of resources, we have to go a bit further, and prioritize how important this request might be vs. all other potential requests.
Let's arbitrarily designate this particular application as "business support".  It's somewhat important (isn't everything?) -- but not as important as either mission-critical or business-critical applications.

It needs reasonable performance, but not at the expense of more important applications.  It needs a modicum of data protection and resiliency, but can’t justify anything much more than the basics.   It has no special security or compliance requirements, other than the baseline for internal applications.  
The average large enterprise might have hundreds (or perhaps many thousands) of applications that fall into this category.
Under normal conditions, all requests using this policy are granted (and ideally paid for) as you’d expect.  But what if resources become constrained? 
Yes, your "business support" application will get the requested vCPUs and memory, but — if things get tight — it may not get what you wanted, perhaps temporarily.  Here’s the storage requested, but if we come up short, your application may be moved to cheaper/slower stuff and/or we’ll turn on dedupe.   Here’s the network connectivity you requested for your app, but …  you get the idea.
Our expanded notion of application policy not only can be used to specify what’s required, but also how to prioritize this request against other competing requests for the same shared resources.  Why? Resource constraints are a fact of life in any efficiently-run shared resource (or cloud, if you prefer). 
Let’s take our idea a bit further.  The other scarce resource we can prioritize using this scheme is “IT admin attention”.  Since our business support application isn’t as critical as others, that implies that any errors or alarms associated with it aren’t as critical either.    

What about the final situation — a “special event”, such as hardware failure or software upgrade?   No surprise — lower priority.  
Just to summarize, our notion of application policy not only addressed the resources and services desired at provisioning time, but also gave guidance on how to prioritize these requests, how closely it should be monitored, how tightly the environment needs to be controlled, etc.
All in one convenient, compact and machine-readable description that follows the application wherever it goes.
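Here is one hypothetical rendering of such a description in Python, just to make the shape of the thing concrete; every field name and value below is invented for illustration, not taken from any real product.

```python
from dataclasses import dataclass, field

# Invented structure: a single, compact, machine-readable object carrying
# resources, priority, and guidance for constrained operations and special events.
@dataclass
class ApplicationPolicy:
    name: str
    tier: str                      # e.g. "business-support"
    vcpus: int
    memory_gb: int
    storage_gb: int
    data_protection: str           # e.g. "nightly-backup"
    security_baseline: str         # e.g. "internal-default"
    resource_priority: int         # who yields first when resources get tight
    alert_priority: int            # how urgently admins should react to its alarms
    special_events: dict = field(default_factory=dict)

crm_reporting = ApplicationPolicy(
    name="crm-reporting",
    tier="business-support",
    vcpus=4,
    memory_gb=16,
    storage_gb=500,
    data_protection="nightly-backup",
    security_baseline="internal-default",
    resource_priority=3,
    alert_priority=3,
    special_events={"hardware_failure": "defer-to-critical-apps"},
)

print(crm_reporting)
```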
Now To The Other Side Of The Spectrum
Let’s see how this same policy-centric thinking can be applied to a mission-critical application.  
Once again, our application policy has specified desired resources and services needed (compute, memory, storage, bandwidth, data protection, security, etc.) but now we need to go in the other direction.

If this particular mission-critical application isn’t meeting its performance objectives, one policy recourse might be to issue a prioritized request for more resources — potentially at the expense of less-critical applications.  Yes, life is unfair.
When it comes to critical services (e.g. high availability, disaster recovery, security, etc.) we’d want continual compliance checking to ensure that the requested services are in place, and functioning properly.
And, when we consider a “special event” (e.g. data center failure, update, etc.), we’d want to make sure our process and capabilities were iron-clad, e.g. no introducing new software components until testing has completed, and a back-out capability is in place.
But Isn’t That What We’re Doing Today?
Yes and no. 
We tend to naturally think in terms of classes of services, prioritization, standard processes, etc.  That’s typical.  And, certainly, we're using individual policy-based management tools in isolated domains: security, networking, perhaps storage and so on.
What’s atypical is the top-down automation of all aspects of IT service management, using a centralized application policy as a core construct to drive automation.

With this approach we don't have to limit ourselves to a few, impossibly broad policy buckets, like "business critical".  We can precisely specify what each and every application might need, separately and independently.  
It seems to be a truism that IT spends 80% of their time on 20% of the applications -- mostly because their requirements are unique.  
Application policy can easily capture -- and automate -- the "exceptions" to standard buckets.
Taking The Next Step Forward — Hard, or Easy?
When I consider the previous transformation from typical silo-oriented IT to ITaaS, it has often proven to be difficult and painful.  When you undertake to change IT's basic operating model, it demands strong, consistent leadership.

It's not just that new approaches have to be learned; it's that so much has to be unlearned.
And the ITaaS transformation isn’t just limited to the IT function.  Not only does IT need to learn how to produce differently, the business also needs to learn how to consume (and pay for services) differently. 
But for those who have already made this investment — and you know who you are — the next step to policy-based automation is comparatively easy.  Indeed, in many ways it will be a natural progression, resulting from the need for continually improving and impactful automation.  
To achieve this desirable outcome on a broader scale, there are more than a few hurdles to consider.
First, all participants and components in a policy-driven IT environment need to be able to react consistently to external policy. 
This, in many ways, is software-defined in a nutshell.   Indeed, when I'm asked "why software defined?" my knee-jerk response is "to better automate".
Servers need to react.  Networks need to react.   Storage needs to react.  Data protection and security and everything else needs to react.  All driven by policy.
Policy responses can’t be intrinsic to specific vendor devices or subsystems, accessed only using proprietary mechanisms.   Consistency is essential.  Without consistency, automatic workflows and policy pushes quickly become manual (or perhaps semi-automated), with productivity being inherently lost.   
In larger enterprise environments, achieving even minimal consistency is no trivial task.  Hence the motivation behind software-defined. 
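Here is a minimal sketch of what "everything reacts to policy through a consistent mechanism" might look like: each domain implements the same small interface, and one policy push cascades through all of them. The class and field names are invented for illustration.

```python
from abc import ABC, abstractmethod

# Invented interface: compute, storage, and network all accept the same
# policy object through the same call, rather than each exposing its own
# proprietary knobs.
class PolicyDriven(ABC):
    @abstractmethod
    def apply(self, policy: dict) -> None:
        """Reconfigure this domain to satisfy the given policy."""

class Compute(PolicyDriven):
    def apply(self, policy: dict) -> None:
        print(f"compute: allocating {policy['vcpus']} vCPUs")

class Storage(PolicyDriven):
    def apply(self, policy: dict) -> None:
        print(f"storage: provisioning {policy['storage_gb']} GB")

class Network(PolicyDriven):
    def apply(self, policy: dict) -> None:
        print(f"network: reserving {policy['bandwidth_mbps']} Mbps")

def push_policy(policy: dict, domains: list) -> None:
    # One policy push cascades consistently through every domain.
    for domain in domains:
        domain.apply(policy)

push_policy(
    {"vcpus": 4, "storage_gb": 500, "bandwidth_mbps": 100},
    [Compute(), Storage(), Network()],
)
```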
Second, serious process work is required to formally document actionable policies in machine-readable form.  So much of IT operations is often tribal knowledge and accumulated experience. 
As long as that knowledge lives in human brains — and isn’t in machine readable form — automation productivity will be hampered.
Third, the resulting IT organization will likely be structured differently than today, overweighted towards all aspects of process: process definition, process measurement, process improvement — just as you would find in non-IT environments that invest heavily in automation.
And any strategy that results in refashioning the org chart brings its own special challenges.
Creating That End State Goal
So much of leadership is painting a picture for teams to work towards.  When it comes to IT leadership, I think policy-based automation needs to be a component of that aspirational vision. 

A world where virtually all aspects of IT service management are driven by application-centric, machine-readable policies.  Change the policy, change the behavior.
The underlying ideas are simple and powerful.  They stand in direct contrast to how we’ve historically done IT operations — which is precisely what makes them so attractive.
And their adoption seems to be inevitable.

Ubuntu 12.04 vs. Windows 8: Five points of comparison


The leading Linux desktop and the number one desktop of all, Windows, are both undergoing radical transformations, but which will be the better for it?
Windows 8 Metro vs. Ubuntu 12.04 Unity
2012 has already seen a major update of what's arguably the most important Linux desktop, Ubuntu 12.04, and we're also seeing the most radical update of Windows since Windows 95 replaced Windows 3.1, with the arrival of Windows 8 Metro. So, which will end up the better for its change?
1. Desktop interface
Ubuntu replaced the popular GNOME 2.x interface with Unity when their developers decided the GNOME 3.x shell wasn't for them. Some people, like the developers behind Linux Mint, decided to recreate the GNOME 2.x desktop with Cinnamon, but Ubuntu took its own path with Unity.
In Unity's desktop geography, your most-used applications are kept in the Unity Launcher bar on the left. If you need a particular application or file, you use Unity's built-in Dash. Dash is a dual-purpose desktop search engine and file-and-program manager that lives at the top of the Unity Launcher.
Its drawback, for Ubuntu power-users, is that it makes it harder to adjust Ubuntu's settings manually. On the other hand, most users, especially ones who are new to Ubuntu, find it very easy to use. Canonical, the company behind Ubuntu, has made it clear that regardless of whether you use Ubuntu on a desktop, tablet or smartphone the Unity interface is going to be there and it's going to look the same.
Windows 8 Metro is, if anything, even more of a departure from its predecessor than Unity. At least with Unity, you're still working with a windows, icons, menus, and pointers (WIMP) interface. Metro has replaced icons with tiles. In addition, by default, you can only work with applications in tiles or in full-screen format. Even such familiar friends as the Start button are missing.
I've been working with Metro for months now. After all that time, I still think Windows 8 with Metro will be dead on arrival. Even people who really like Metro say things like "the default presentation is ugly and impersonal." You can make Metro a lot more usable, but that's a lot of work to make an interface that's already ugly prettier and, when you're done, you're still left with an interface that doesn't look or work the way you've been using Windows for years.
True, there's also the Windows 8 Desktop, which no longer has a Start button but otherwise looks and works like the Windows 7 Aero interface; it's a sop to users who don't want Metro. Sooner rather than later, Microsoft wants everyone on Metro. Of course, on some platforms, such as Windows RT, the version of Windows 8 for ARM tablets, Metro is the only choice.
2. Applications
For ages, one of the bogus raps against desktop Linux has been that there haven't been enough applications for it. That was never true. What Linux didn't have was the same applications as Windows. To an extent, that's still true. You still can't get, say, Quicken, Outlook, or Photoshop natively on Linux. Of course, with the use of WINE and its commercial big brother, CodeWeavers' CrossOver, you can run these and other Windows programs on top of Linux.
On the other hand, I find some Linux programs, such as Evolution for e-mail (an optional program in Ubuntu), to be far better than their Windows equivalents. In addition, if, like more and more people these days, the program you really use all the time is a Web browser, then Windows has no advantage whatsoever. Chrome, which my testing has shown time and again to be the best Web browser around, runs equally well on Ubuntu and Windows. On both, however, you'll need to download it: Ubuntu defaults to Firefox, and Windows 8, of course, uses Internet Explorer.
What I find really interesting though is that Microsoft is actually removing functionality from Windows 8. If you want to play DVDs on Windows 8 or use it as a media center, you'll need to pay extra. DVD-players and the power to stream media remain free options in Ubuntu and most other Linux distributions.
3. Security
There has been a lot of talk lately about malware on Macs, and it's true: Macs are vulnerable to security breaches. So, for that matter, are Linux systems. But never, ever forget that for every single Mac virus or worm, there have been thousands of Windows attackers. And while Linux can be attacked as well, in practice it's more secure than either Mac OS X or Windows, and there has never been a significant Linux desktop security worm.
Could it happen? Sure. But, get real. I do run Linux with virus protection, ClamAV, because I'm paranoid, and even so I've never seen a single attacker, much less suffered a successful attack, in almost twenty years of using Linux desktops. I wish I could say the same of my Windows systems.
4. Total Cost of Ownership (TCO)
Thanks to Active Directory (AD), it's long been easy to manage Windows desktops, but thanks to the Lightweight Directory Access Protocol (LDAP) and tools like Landscape, it's no problem in Ubuntu Linux either. Indeed, since you won't be able to use AD to manage Windows RT systems, Ubuntu Linux actually provides a more unified management system.
Also, remember what I said about security? You can't forget anti-virus software or patching Windows for a minute. Linux? Yes, you should use anti-virus programs and patch regularly, but relax: you're not asking for zero-day doom all the time the way you are with Windows. Besides, the upfront cost of Linux? Zero. Windows 8? We don't know yet, but we do know that Windows 8 PCs will be more expensive than their Windows 7 brothers.
If you're really serious about cutting your desktop costs, Linux is the way to go.
5. Ease of use
One of the perpetual myths about Linux is how hard it is to use. Oh really? Don't tell my 80-year-old Ubuntu-using mother-in-law or Jason Perlow's Linux-using mom-in-law. They're both using Ubuntu 12.04 and loving it. Why? Because it's so easy to use.
Metro, on the other hand... well, you know I don't like it, but I think it's telling that a Bing search (not Google, Bing) showed 3.32 million results for "Windows 8 Metro sucks." Many users, including our own Scott Raymond, would like it if Microsoft gave users the option to turn Metro off. That's not going to happen.
Another plus for Ubuntu: say you really can't stand Unity. No problem; you can switch to GNOME 3.x, Cinnamon, KDE, whatever. With Ubuntu, while they want you to use Unity, you can choose another Linux desktop interface. With Windows 8, you're stuck with half-Metro and half-desktop.
Put it all together and what do you get? Well, I don't see Ubuntu overcoming Windows on the desktop. There are just too many Windows users out there. The Linux desktop will never catch up with it.
My question though wasn't who was going to end up the most popular desktop. It was "which will end up the better for its change?" To that question, there's only one answer: Ubuntu is the winner. I foresee Windows XP and 7 users sticking to their operating systems and giving Windows 8 the same cold shoulder they gave Vista and Millennium Edition.
That will end up being a real problem for Windows. Back in the day, Microsoft's iron grip on the desktop meant it could have flops and still not lose much. Today, though, we're moving away from the desktop to a world where we do much of our work in the cloud, and for that we can use tablets and smartphones as well. And on tablets and smartphones, Microsoft has yet to show that Windows can play a role. Thanks to Android, we already know Linux is a major player on those, and Ubuntu is already making a desktop/Android smartphone partnership play.
All-in-all, Ubuntu is going to be far more successful for its changes than Microsoft will be with its operating system transformations.