
Saturday, 13 August 2011

Choosing an IT Cloud Services Provider - Internal Or External - Top Executive Considerations


Whether you are an executive of a global enterprise or a business owner with a small IT team, the pressure is on. Your organization needs to cut costs and boost service levels. You want to be a champion for your team -- a leader in maximizing rewards and minimizing risks. To do so, you must be deliberate about focusing your internal resources on core competencies - the key drivers that most differentiate your organization from others in your industry. And you must be honest about when and where it makes sense to apply leverage -- using outside resources, new technologies or new processes -- to give your organization the greatest advantages and the best foundation for supporting those core competencies.

Once you accept this mission, how do you execute it successfully?

The following brief presumes you have already identified suspect activities that merit further scrutiny as candidates for internal and/or external service providers. Once this shortlist exists, it is time to consider alternatives for the scope of services you may want from service providers.

Preparing services scope alternatives

The extent of services, and the activities comprising them, to be included in providers' proposals is called the scope of work, or scope. There are many alternatives, ranging from an all-encompassing, holistic scope down to scope that is highly sub-divided into key activities and/or augmentations. Key variables to consider are:

o Who is Responsible? Which provider has primary, secondary and/or tertiary responsibilities for each activity, depending on the degree of contribution and/or impact of their actions?

o Support Frequencies? How often will services be required? For example: One-time, Periodic (Weekly, Monthly, Quarterly, Annually), and/or Stream (on-going)

o Where Operated? At which site(s) will services be provided? Internal Site(s); Vendor Site(s); 3rd party Site(s)

o Key Service Levels? What are the key performance and/or timeliness metrics that will govern the services? Are remedies (including penalties) expected if actual results fall below agreed levels? Are incentives available for over-achieving targets? Remember to keep a good balance -- use as few and as simple metrics as possible while still assuring performance and enabling flexibility.

o Burdened cost adders? External service providers must include costs in their proposals that internal providers often omit from theirs. Because of such oversights, significant benefits of leveraging external resources have been lost. For valid comparisons, you must look beyond employee salaries alone. Be complete in considering all burdened cost adders that will impact your organization, such as: overtime, benefits, management, training, support, facilities, furniture, computers, communications, administration, corporate allocations and other important operating and/or capital costs. When weighing services scope alternatives, consider where to 'move' these burdens to maximize your organization's advantages.
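To make the burdened-adder idea concrete, here is a minimal sketch of such a comparison. All figures and cost categories below are hypothetical illustrations, not drawn from any real proposal:

```python
# Hypothetical fully-burdened cost comparison (all figures illustrative).
# An internal proposal that counts only salaries can look cheaper than it is;
# adding the burdened cost adders gives a fairer basis for comparison.

internal_salaries = 400_000          # annual salaries for the internal team

burden_adders = {                    # illustrative burdened cost adders
    "benefits_and_overtime": 120_000,
    "management_and_training": 60_000,
    "facilities_and_furniture": 45_000,
    "computers_and_communications": 35_000,
    "administration_and_allocations": 40_000,
}

internal_burdened = internal_salaries + sum(burden_adders.values())
external_proposal = 620_000          # external provider's all-in annual price

print(f"Internal, salaries only:   ${internal_salaries:,}")
print(f"Internal, fully burdened:  ${internal_burdened:,}")
print(f"External, all-in proposal: ${external_proposal:,}")
# Compared against salaries alone, the external bid looks 55% more expensive;
# compared against the fully-burdened figure, it is roughly 11% cheaper.
```

With these made-up numbers, the conclusion flips entirely depending on whether the burdened adders are counted -- which is precisely why equitable cost accounting matters.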

You can easily see how providers' proposal costs will vary dramatically depending on these and other key variables. Ideally, your cost accounting should align in a manner allowing clear and equitable comparisons. This will save you lots of time in the long run. You also need to be thoughtful in comparing service providers' proposals to assure you understand these nuances for each proposal.

Prudent, knowledgeable service providers will shy away from scope alternatives where they cannot execute effectively or efficiently enough to assure success, and/or only agree to 'reasonable efforts' towards service level targets until they have time to understand the environment and apply their value to it.

Service providers will also shy away from scope alternatives that carry too much risk of expending unplanned resources on problems originating outside their responsibilities -- even when there are seemingly clear terms for charging additional time and materials. The big challenge here stems from the inter-dependencies among technology, processes and people. In the real world, it can take much time and expense to diagnose problems and attribute responsibility. As a result, experienced service providers may avoid such scope alternatives entirely. Be wary of service providers who are not cautious about this, since it may indicate their proposals carry higher levels of risk for you.

From a risk assessment and mitigation standpoint, thoroughly think through your services scope alternatives presuming finger-pointing scenarios could occur between providers. Which scope alternatives avoid or minimize the chance of these scenarios? Which scope alternatives minimize and/or quickly resolve unwanted impacts and costs if finger-pointing does occur?

Another area requiring thought is identifying volume drivers that closely correlate to activity costs. All good service providers - internal or external - must have identified the key variables or drivers for managing and charging for the resources they deliver. Forecasts for these volumes will need to be agreed upon, ideally in a manner that aligns with your organization's business growth scenarios. Billing and/or charge-back plans will most likely depend on these estimates.
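As a sketch only -- the drivers, unit rates and volumes below are entirely hypothetical -- a volume-driven charge-back plan of the kind described above might be modeled like this:

```python
# Hypothetical charge-back sketch (driver names and rates are illustrative).
# Each service's cost is tied to an agreed volume driver and unit rate;
# forecast volumes set the expected bill, actual volumes set the real one.

unit_rates = {                     # agreed rate per unit of each volume driver
    "help_desk_tickets": 18.00,    # per ticket resolved
    "managed_servers": 250.00,     # per server per month
    "storage_tb": 90.00,           # per TB stored per month
}

forecast_volumes = {"help_desk_tickets": 1200, "managed_servers": 40, "storage_tb": 25}
actual_volumes = {"help_desk_tickets": 1350, "managed_servers": 40, "storage_tb": 27}

def monthly_bill(volumes: dict) -> float:
    """Total monthly charge given volumes for each agreed driver."""
    return sum(unit_rates[driver] * qty for driver, qty in volumes.items())

print(f"Forecast bill: ${monthly_bill(forecast_volumes):,.2f}")
print(f"Actual bill:   ${monthly_bill(actual_volumes):,.2f}")
```

The design point is that both parties agree on the drivers and rates up front, so that growth in business volumes translates into predictable, auditable changes in the bill rather than renegotiation.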

As you prepare your scope alternatives, also consider the timing of what is needed. What scope do you envision needing in the next 3- to 12-months, and beyond? What scope additions or subtractions may be required in the future -- say in the next 12- to 36-months?

Explore what is happening in your industry via industry associations, your network of colleagues, benchmarking and/or consultants. These are among the best ways to learn about -- and confirm -- both the good and the not-so-good outcomes others have experienced before you act.

After completing all the work described above, you will be better positioned to understand and communicate the scope of work alternatives being considered, and to more easily and quickly compare internal and external service providers' proposals.

In this business era, an important route to significant benefits is working with internal and/or external service providers who are embracing Cloud Computing concepts. If they are not proactive about Cloud Computing, will they be prepared for the dynamic changes that have already started?

Cloud Computing - Why the Hype?

A good definition of and introduction to Cloud Computing is provided by Wikipedia; simply search for "Cloud Computing".

My view of Cloud Computing is that it is a paradigm that intrinsically combines hardware + software + communications + operations + other technologies into an integrated, holistic solution.

Typically, the following benefits are expected with Cloud Computing:

o Dramatic CapEx and OpEx savings & quick ROI versus more 'traditional' approaches -- much less spend

o Dramatically better scalability, flexibility, accessibility, performance, etc -- much better "bang"

o Dramatic alignment of service levels & related costs to user-groups -- better alignment & control

I have observed all these benefits, including conservatively measured savings as high as 55%. The savings might have been even higher had the comparisons been truly equitable. With 30% to 55% savings... plus much-enhanced service levels and alignment... Cloud Computing solutions easily command executives' attention. Hence the intense focus and discussion around cloud-based solutions.

The challenge with Cloud Computing is there are many inter-dependencies and nuances that must be considered with these approaches. Different vendors are taking different paths, as they evolve their existing offerings to be increasingly cloud-oriented. Some of these offerings are still relatively immature. Industry associations are working to establish standards and common vernaculars regarding technologies and processes - I'd say the telecommunications industry is leading in this regard. However, some changes that are occurring across all industries are still quite dramatic and sometimes not without bugs. Prudent executives are cautious about making investments where there are higher chances for change and therefore where risks or mis-investments are greater.

However, there are very practical (low-risk) opportunities enabling executives to act now with strong ROI and service level advantages -- either via internal providers leveraging cloud-based technologies and/or with the help of external providers.

Are you prepared to compare internal and external providers?

Do you understand your organization's:

o Top suspect activities to be scrutinized?

o Scope Alternatives?

o Accurate costs - and related volumes and forecasts?

o Target benefits?

o Possible added costs and risks?

If you have this understanding, then you are ready to engage in comparing internal and/or external service providers.

The combination of high complexity, extreme cost competitiveness, expectations of high service levels and dynamic change is forcing key transformations to occur. For you to be successful as your organization's champion, you must apply leverage - taking advantage of scale and expertise -- when and where it makes sense. However, there are many providers from which to choose. How will you compare these internal and external providers?

A Trusted Provider - Look for their 'Deeds to Speak'

As the old adage goes, "deeds speak louder than words". Whether it's an internal or external provider, it's up to you to see through the marketing, sales, good intentions and other less-well-intended smoke and focus on their deeds -- their behaviors.

Of course, tangible savings are a large part of your decision. Most executives look for at least a 20% reduction in overall related expenses per annum. I have personally led services engagements which delivered 30% and upwards of 55% savings in expenses during the first year alone -- while also delivering significant service level advantages. Huge competitive advantages were delivered. My clients were able to re-mix internal resources to focus on core competencies that drove greater client-visible differentiation. Clients also avoided adding resources and kept their high-contributing people from leaving or cracking under pressures that had built up over time. By the way, high levels of overtime and/or increasing frequency and magnitude of service level impacts are key indicators of burn-out conditions, and these are important quantitative costs to consider as well.

However, what about the 'intangible' parts of your decision? There are vital clues about the relationship cost you may experience with your internal or external service provider. These aspects matter -- certainly for risk mitigation -- and may sooner or later significantly impact tangible results. This is especially important when the quantitative aspects of different service providers' proposals match up in a similar manner.

What follows is a brief sketch of intangible qualities - the key behaviors -- to consider and compare.

The key objective: assess your prospective providers. Can they be trusted to deliver game-changing advantages today, tomorrow and beyond? Key indicators are described below.

A continuous commitment to excellence -- including integrity, best value, flexibility and responsiveness / proactivity. What is the provider's corporate conscience? Several key behaviors to observe are:

o Do their deeds consistently match their words?

o Do they demonstrate a consultative approach?

o Do they offer proof points of proactive industry leadership?

Thought Leadership. Does the provider demonstrate that they grasp the direction of business and technology? In the cloud computing era, if they are not focused on the future, how will they continue to be at least competitive and at best industry-leading in the coming months and years as dynamic changes occur? Several key behaviors to observe are:

o Solid understanding of current and emerging concepts.

o Do they have cross-silo (hardware, software, communications, services, etc) connections with industry leaders?

o Connection with industry changes. What are their perspectives?

Practical Leadership. Even more important than thought leadership, how well does the provider execute? Do they deliver practical excellence in savings and service levels - today? Are they proactive, or do they respond only when challenged? Several key behaviors to observe are:

o Multiple IT vendors are supported (hardware, software, communications, services)

o Breadth of services and depth of functional know-how that they deliver.

o Security and aligned segregation of duties and/or information.

o Current clients in the same disciplines you are considering. Or, in cases of new offerings, what is their track record for other clients? Ideally, client referrals. Ideally, from well-recognized, notable clients.

o Industry recognition for performance and/or support.

o Quick and disciplined processes that incorporate quality assurance.

Quickness and ease of doing business. An important sub-set of practical leadership is how quickly and easily you can do business with the provider, and how readily the provider adapts to changing needs and/or competition. Several key behaviors to observe are:

o Flexible scope options: support frequency, who runs the infrastructure, where the infrastructure resides.

o Flexible support options (worldwide, country, region, local)

o Flexible Quality of Services that align with your user groups' requirements.

o Flexible menus and charges.

o Consistent governance and billing / charge-backs.

o Attentiveness that you will receive. What percentage of the provider's business would this project represent?

o Transitions - Time and Costs - during both start-up and wind-down phases.

Key Personnel. If the provider's key people are not strong in exhibiting good behaviors, then how will their team continue to stay the course in delivering a competitive edge? Assess the following people:

o Leadership Management

o Solution champion ('seller')

o Architect / Solution Designer

o Account Executive / Delivery Executive

o Consultants and Delivery Specialists

o Project Manager

There are many behaviors to observe; those described above are only a short list. In general, any warning sign that pops up merits additional exploration. On the other hand, providers who demonstrate good behaviors in all these areas deserve additional consideration. As a top executive, you will have to apply thoughtful judgment in weighting the quantitative and qualitative characteristics in a manner that best suits your organization's goals.

It's time for you to take action

In the cloud computing era, you must evaluate how to optimize and best align your organization to differentiate its offerings to clients while taking best advantage of internal and external service providers' scale and expertise. To be most successful, you need to consider more than quantitative savings - as vital as these are. You also need to consider how your organization's providers will perform over time.

The most successful internal and/or external service providers for today and tomorrow will likely be those who demonstrate key qualitative behaviors. The behaviors that should resonate through their leaders, key personnel and throughout their whole team include: commitment to excellence, thought leadership, practical leadership, and ease of doing business. These behaviors are described more fully in the above brief. As the executive champion for your organization, observing these behaviors will give you a vital edge in thoughtfully comparing service provider alternatives and in facilitating the best decisions.

About The Author:




Lance Gattoni invites enterprising executives who are interested in his services to send an email to: lance.gattoni@gmail.com

His services include consulting engagements and, where the alignment is strong, key full-time positions on your team. Either way, by adding him to complement your team, you become the sponsor for quickly driving collaborative innovations and achievements that make a difference.





This post was made using the Auto Blogging Software from WebMagnates.org This line will not appear when posts are made after activating the software to full version.

Thursday, 28 July 2011

Will the government get serious on cloud security, data privacy?

When the federal government finally does undertake the task of legislating around cloud computing, it seems very likely that security measures and data privacy will drive the ship. On Tuesday the TechAmerica Foundation’s CLOUD2 commission announced a data- and security-heavy set of recommendations to guide the federal government’s efforts in regulating, adopting and promoting the cloud, following up on a recent Brookings Institution discussion on a proposed Cloud Computing Act that focuses on those two issues. This isn’t surprising, given that these are two areas in which the government can most directly affect the nature of the cloud.

I covered TechAmerica’s CLOUD2 commission when it kicked off in April, highlighting its mission to advise the Obama administration on cloud computing best practices. The commission comprises representatives of more than 70 organizations and is spearheaded by Salesforce.com CEO Marc Benioff. Of the 14 recommendations it made today, eight focus on security and/or data privacy. They call for everything from the creation of an industry-wide security framework to updating the Electronic Communications Privacy Act (also the goal of the Digital Due Process coalition) to leading the charge to open up transnational data flows across cloud infrastructure.

The commission also calls for, among other things, increased data portability among clouds — something Commissioner Kurt Roemer of Citrix told me it would back in April — and for the modernization of our broadband infrastructure to better support cloud services.

Here’s one particularly meaty recommendation from the report summary released today:

Transnational Data Flows – Recommendation 6 (Government/Law Enforcement Access to Data): The U.S. government should demonstrate leadership in identifying and implementing mechanisms for lawful access by law enforcement or government to data stored in the cloud.

Under this recommendation, the Commission suggests three steps to increase clarity around the rules and processes cloud users and providers should follow in an international environment. Without U.S. leadership and cooperative international efforts, the world will face a far more complex legal environment, one that is not conducive to fully leveraging the cloud. The three steps are: (1) modernize legislation (the Electronic Communications Privacy Act) governing law enforcement access to digital information in light of advances in IT; (2) study the impact of the USA PATRIOT Act and similar national security laws in other countries on companies’ ability to deploy cloud in a global marketplace; and (3) have the U.S. government take the lead on entering into active dialogues with other nations on processes for legitimate government access to data stored in the cloud and processes for resolving conflicting laws regarding data.

A fuller version of the report is available here.

The CLOUD2 commission’s recommendations come just more than a month after the Brookings Institution convened a panel to discuss proposed legislation called the Cloud Computing Act of 2011. As I explained at the time, that potentially forthcoming bill will focus on cybersecurity practices and punishments, as well as providing clarity on moving and storing data across international boundaries. The transcript of that panel is available here.

Again, it’s not surprising that much of the talk about how the federal government might get involved with cloud computing focuses on security and privacy. After all, these are areas where it can more easily effect change because it can define policy rather than trying to dictate technological standards. Only the federal government can enact federal security-breach-notification laws like CLOUD2 suggests or rewrite the ECPA to bring the Fourth Amendment up to speed with how and where data is stored in the cloud. The federal government is certainly the only institution in our country that can enter into the international data treaties that both CLOUD2 and the senators proposing the Cloud Computing Act think are necessary.

On topics such as interoperability and uniform security protocols, though, the government likely will have to tread lightly and lead with its checkbook. Although both are laudable goals, they’re probably best left for the companies involved to solve. Cloud computing might be a sea change in the way we access IT, but it’s ultimately not too different from past standardization efforts that were driven by the private sector looking to increase revenues while making consumers’ lives easier. They weren’t always pretty, but it’s probably not the government’s place to decide how clouds will be built or how they’ll work together.

In fact, private-sector efforts around both interoperability and security standards are already in place. The Cloud Security Alliance is focused on security, and a new organization called the Open Cloud Initiative launched today to push for interoperability among cloud platforms.

The government does have a mega IT budget, though, and is pushing a cloud-first strategy when it comes to buying new resources. Amazon Web Services, Google and others already have proven willing to bend to the government’s needs in order to get its business, so perhaps it can drive industry standards around interoperability and security by demanding certain levels of both in order to get federal business.

Image courtesy of TechAmerica Foundation






Friday, 22 July 2011

CX funding shows cloud storage startups are still hot

Judging by the latest news out of startup cloud-storage provider CX, heavyweight investors are still bullish about newcomers to the space. CX announced on Thursday that it has closed on a Series B funding round worth $5 million led by Eric Schmidt’s venture capital firm TomorrowVentures, bringing its total VC investment to $10 million.

Also on Thursday, CX said it has acquired its competitor FileDen for an undisclosed sum. The deal brings CX’s user base to more than 3.5 million, the company said.

CX, which stands for “Cloud Experience,” bills itself as an open storage platform that overlays users’ social graphs to enable collaboration with anyone on any Internet-connected device. The company is most often compared to storage services like Dropbox and Box.net. CX claims it is differentiated from its peers by better searching and sharing capabilities along with data visualization features.

CX is currently free, but it plans to launch paid product plans later this summer that will charge consumers about $10 per month and developers $40 per month.

It has been clear for years now that cloud storage technology is hot — tech industry giants such as Amazon, Google and, most recently, Apple have made big moves into the area — and CX is looking to ride that wave. According to CEO Brad Robertson, CX is currently in discussions to close on $50 million in series C funding early this fall.

Along with TomorrowVentures, CX is backed by Hanna Capital, Clarington Capital, and Clearwater Capital.






Cloud computing could lead to billions in energy savings

Another study out this week has found that if companies adopt cloud computing, they can reduce the energy consumption of their IT and save money on energy bills. The report, created by research firm Verdantix and sponsored by AT&T, estimates that cloud computing could enable companies to save $12.3 billion on their energy bills. That translates into carbon emission savings of 85.7 million metric tons per year by 2020.

The Verdantix report isn’t the first one to deliver such a finding. Last year Pike Research found that cloud computing could lead to a 38 percent reduction in worldwide data center energy use by 2020, compared to what the growth of data center energy consumption would be without cloud computing. Another study from Microsoft, Accenture and WSP Environment and Energy last year found that moving business applications to the cloud could cut the associated per-user carbon footprint by 30 percent for large, already-efficient companies and as much as 90 percent for the smallest and least efficient businesses.

All of that is good news. Cloud computing is one of the most disruptive Internet infrastructure shifts to happen in recent years. Web companies have been embracing cloud computing in order to buy flexible, lower cost, on-demand computing power from companies like Amazon. And these cloud computing services generally replace the computing that would have been done by companies’ own in-house computing resources.

However, it’s always good to take these studies with a grain of salt. There’s a reason AT&T and Microsoft are looking into the energy efficiency of cloud computing: they sell cloud computing services.

Other studies have also found that cloud computing isn’t always the most energy efficient computing option, and in certain instances the cloud can be more energy intensive than traditional in-office computing. A report from University of Melbourne researcher Rod Tucker and his team, which I wrote about for GigaOM Pro (subscription required), found that cloud computing can indeed save energy when it leads simply to the consolidation of servers, but looking at three different applications of cloud computing — storage, software and processing — energy efficiency savings are negated in some scenarios.

For example, one such instance when the cloud isn’t more efficient, according to Tucker’s research, is when companies are using cloud computing for storing data. Tucker found that when the number of downloaded and accessed files becomes larger (more than one download per hour for a public cloud storage service), those energy efficiency gains are erased.
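The break-even logic behind that finding can be sketched in a back-of-the-envelope model. The energy figures below are placeholder assumptions for illustration only, not Tucker’s actual data:

```python
# Illustrative break-even model for cloud vs. local storage energy.
# All energy figures are hypothetical placeholders, chosen only to show
# the shape of the trade-off: cloud storage itself is more efficient,
# but each download adds network transport energy that local disks avoid.

LOCAL_STORAGE_WH_PER_GB_MONTH = 2.0   # energy to keep 1 GB on a local disk
CLOUD_STORAGE_WH_PER_GB_MONTH = 0.5   # shared data-center storage is more efficient
TRANSPORT_WH_PER_GB = 1.0             # network energy per GB downloaded from the cloud

def monthly_energy_wh(downloads_per_month, gb=1.0):
    """Return (local, cloud) monthly energy in Wh for `gb` of data."""
    local = LOCAL_STORAGE_WH_PER_GB_MONTH * gb
    cloud = CLOUD_STORAGE_WH_PER_GB_MONTH * gb + TRANSPORT_WH_PER_GB * gb * downloads_per_month
    return local, cloud

for downloads in (0.5, 1.5, 5.0):
    local, cloud = monthly_energy_wh(downloads)
    winner = "cloud" if cloud < local else "local"
    print(f"{downloads:4.1f} downloads/month: local {local:.1f} Wh, cloud {cloud:.1f} Wh -> {winner}")
# With these placeholder numbers the break-even point sits at 1.5 downloads
# per month; above it, transport energy erases the cloud's storage advantage.
```

The specific threshold depends entirely on the assumed storage and transport energies; the point is simply that frequent access shifts the balance away from the cloud, which matches the direction of Tucker’s result.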

There’s enough research out there by now showing that cloud computing is, overall, more energy efficient than traditional in-house computing — great news for Internet companies and cloud computing providers. The energy consumption of the Internet, data centers and our always-on connected devices will only continue to grow, so efficiency trends will only become more important.

Image courtesy of The Planet.


window.fbAsyncInit = function() {FB.init({appId: 180650338636285, status: true, cookie: true, xfbml: true});FB.api({method: 'links.getStats',urls: 'http://gigaom.com/cleantech/cloud-computing-could-lead-to-billions-in-energy-savings/'},function(response) {jQuery('#react-fb-count-button').html(response[0].commentsbox_count);});FB.Event.subscribe('comment.create', function(response) {var ajaxurl = 'http://gigaom.com/wp-admin/admin-ajax.php?action=new_fb_comment&post_id=';jQuery.get(ajaxurl + 380004);});};var e = document.createElement('script');e.type = 'text/javascript';e.src = document.location.protocol + '//connect.facebook.net/en_US/all.js';e.async = true;document.getElementById('fb-root').appendChild(e);

var _comscore = _comscore || []; _comscore.push({ c1: "2", c2: "6036014" }); (function() { var s = document.createElement("script"), el = document.getElementsByTagName("script")[0]; s.async = true; s.src = (document.location.protocol == "https:" ? "https://sb" : "http://b") + ".scorecardresearch.com/beacon.js"; el.parentNode.insertBefore(s, el); })();


View the original article here



Thursday, 21 July 2011

The cloud: now for connecting electric cars

The car, in particular the electric car, is the latest device to get plugged in to the cloud. Coulomb Technologies, which makes networked electric-vehicle charging stations, plans to start selling cloud-based services to allow charge-station owners to manage charging and billing. The service also uses the cloud to connect an EV driver to the nearest charging station and deliver on-demand support and billing services.

The service can be used by the electric-vehicle-charging-station owner — such as a utility, a commercial building owner or an employer — to manage the rates and billing of the service, recoup the investment of getting the stations installed and track the usage of the stations. A charging station can cost anywhere from $2,500 to $6,000, depending on the features of the equipment, according to a recent report from the Electric Power Research Institute (EPRI).

A corporate customer like Google, which installed 70 Coulomb EV charging stations at its headquarters, can monitor how much carbon it’s saving with its installation, and it can also dig into the data to learn more about electric-vehicle charging habits (something Google has been particularly keen to learn about).

If electric vehicles ever become mainstream, utilities will have to manage the charging times and rates so that EVs don’t overload the grid in certain neighborhoods. The cloud-based Coulomb services will enable utilities — like current customers Austin Energy and Orlando Utilities Commission — to manage EV charging alongside their grids. According to the EPRI report, five utilities have already done a lot of research on how to provide EV management services: Southern California Edison, Detroit Edison, Progress Energy, Georgia Power and Sacramento Municipal Utilities District.
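As a rough illustration of what that management problem looks like — this is a toy sketch, not Coulomb’s actual service or any utility’s algorithm — a utility could admit charging sessions against a neighborhood transformer cap and defer the overflow to a later time slot:

```python
def schedule_charging(requests, grid_capacity_kw):
    """Greedy scheduler: admit charging sessions in request order until the
    neighborhood transformer cap is reached; defer the rest to the next slot.

    requests: list of (station_id, draw_kw) tuples.
    Returns (admitted_ids, deferred_ids)."""
    admitted, deferred, load = [], [], 0.0
    for station_id, draw_kw in requests:
        if load + draw_kw <= grid_capacity_kw:
            admitted.append(station_id)
            load += draw_kw
        else:
            deferred.append(station_id)
    return admitted, deferred
```

Three 6.6 kW Level 2 chargers against a 15 kW cap, for example, would see two sessions start immediately and one pushed back — exactly the kind of staggering that keeps a block of EVs from browning out a transformer.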

Internet companies and automakers are also looking to build cloud-based car-connected services. Microsoft and Toyota announced earlier this year that they’ll jointly invest $12 million in a bid to build a cloud-based platform that will connect cars, homes and electrical smart grids. Eventually, Toyota says, nonelectric cars could be connected into the service, too.

Using the cloud to connect cars and EVs makes sense, as there will continue to be more and more data associated with these essentially extra-large and expensive devices. GM’s electric car, the Volt, has 10 million lines of code and an IP address. Keeping EVs connected to the cloud via wireless networks could also be a valuable tool to fight range anxiety, the fear that an EV’s limited battery range will leave the driver stranded without power.

Image courtesy of Nissan



View the original article here



Monday, 18 July 2011

The cloud is like MMA; VMware, Citrix in main event

I’m an MMA fan. The sport of mixed martial arts combines the multiple disciplines of wrestling, boxing, and jiu jitsu into one combat sport fought in an eight-sided cage called the octagon. These athletes are modern-day gladiators. Most come from a core background in one discipline (a collegiate wrestler, for example) and then have to develop secondary skills – boxing and jiu jitsu in this case – to really become professionally competitive in the sport.

I am also an entrepreneur, technologist and CEO. For me, announcements this past Tuesday in the cloud world are a metaphor for two highly trained MMA fighters stepping into the octagon after a two-year training camp spent developing their secondary disciplines in preparation to battle for Cloud Service Provider (CSP) supremacy.

I am, of course, talking about Citrix’s acquisition of Cloud.com  and VMware’s announcement of vSphere 5  as the foundation of its move toward a comprehensive cloud OS. To me, both announcements foreshadow three major shifts that will occur in the coming years in the cloud computing and CSP space:

1. A cloud OS will reign. The versions of Windows, Linux, Solaris, AIX, UX, etc., as we know them today will be rendered useless by the science of cloud computing. This will take some time, and perhaps elements of this layer will be repurposed, but both the hypervisor and the application will incrementally subsume functionality historically performed by the OS. We will see more intelligently pre-configured virtual machines (VMs) running workloads for increasingly intelligent applications pre-configured for a particular hypervisor.

We are already starting to see this with JEOS (Just Enough Operating System) VMs that intelligently adjust for attributes such as memory allocation through their interaction with the hypervisor. Dynamic resource allocation to VMs and the ability of the applications to drive such requests will represent the infrastructure of the future. This will be a core shift in the cloud computing space and will drive the orientation of the foundational levels of the cloud stack, Infrastructure as a Service and Platform as a Service.
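A toy sketch of the kind of policy this implies — hypothetical function and numbers, not any particular hypervisor’s API — is a ballooning-style target that sizes a VM’s memory to its observed working set plus headroom, returning the remainder to the shared pool for other guests:

```python
def balloon_target(vm_used_mb, vm_total_mb, headroom=0.25):
    """Toy dynamic-memory policy: grant the VM its working set plus a
    headroom margin, clamped between a 256 MB floor and the VM's full
    allocation. Everything above the target is reclaimable by the host."""
    target = int(vm_used_mb * (1 + headroom))
    return min(max(target, 256), vm_total_mb)
```

A JEOS-style guest using 1,000 MB of a 4,096 MB allocation would be trimmed to 1,250 MB under this sketch, freeing nearly 3 GB for neighboring workloads — the interaction between application, guest and hypervisor that the paragraph above describes.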

This is like having a core base skill of wrestling for an MMA fighter. All other skill sets leverage the base. A case could be made that VMware is the better wrestler today given its heritage of innovation at the proprietary hypervisor and hypervisor service-provisioning level. On the other hand, it could be argued that Citrix’s acquisition of Cloud.com gives it the better wrestling skills: Cloud.com’s enabling IaaS software IP will undoubtedly provide the base for Citrix’s effort to build out its own open-source cloud OS (perhaps augmenting or even usurping its own Project Olympus efforts).

Too close to call at this point. They have two different styles of wrestling.

2. The hybrid cloud will become the most important cloud. Most of the dialogue today is about which cloud is best for a particular enterprise when considering the choice of commodity clouds (Internet-facing web-scaling clouds such as Amazon Web Services),  public clouds (multi-tenant clouds that can either be Internet- or private-access-based) or private clouds (essentially, dedicated single-tenant virtualized technology).

Key considerations are the obvious ones: security, performance, elasticity, etc. Greenfield applications generally have entirely different requirements than legacy back-office application migration (a space more commonly being referred to now as the enterprise cloud). It goes on and on.

The space is entirely over-marketed in a self-serving manner toward the strengths or away from the weaknesses of the CSPs doing the marketing. All of that will get sorted out in the next 18 to 24 months. Simply put, the best technologies for the required use case will win. If not, shame on the buyer.

As a CSP, the real juice in forward markets will be in defining and controlling the hybrid cloud. By hybrid cloud, I mean the heterogeneous solution of on-premise private clouds interacting with off-premise public clouds. The better the tools are to build on-premise private clouds, the more bursting will occur into off-premise public clouds. It will be a virtuous cycle.
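A minimal sketch of the bursting idea (illustrative only — real placement decisions weigh security, data gravity, latency and cost, not just raw capacity): fill the on-premise private cloud first, then spill the overflow into a public cloud.

```python
def placement(load_vms, private_capacity_vms):
    """Toy hybrid-cloud policy: run workloads on-premise until the private
    cloud is full, then 'burst' the overflow to an off-premise public cloud."""
    on_prem = min(load_vms, private_capacity_vms)
    burst = max(0, load_vms - private_capacity_vms)
    return {"private": on_prem, "public": burst}
```

The virtuous cycle in the paragraph above falls out of this shape: the better the tooling for the private side, the bigger the loads it attracts, and the more overflow there is to monetize on the public side.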

Today, there are very few companies that functionally make money on both ends of that spectrum. I look at both the Citrix and VMware announcements as moves toward filling this future hybrid cloud opportunity. Citrix moves toward this in its acquisition of Cloud.com, whose software assets have been used to build such scaled private clouds as Zynga’s Z Cloud. VMware moves toward this simply by realizing it is no longer good enough to own the private cloud hypervisor market. It is now looking to develop true cloud-management products.

They both have pieces of the off-premise puzzle (and plenty of capital to develop it further) but neither is there yet. This is where this race will be. This area will require both skill and endurance and a tough chin, similar to what the art of boxing means to the MMA fighter. Given VMware’s massive share of the enterprise hypervisor market, it gets the nod as the better boxer at this point, but it’ll have to be wary of Citrix’s open source agility.

3. CSPs will need it all. It’s obvious that having a full cloud stack will matter, but it will need to be integrated in a manner that does not exist today and will broaden its definition. Having an enterprise-grade hypervisor in this fight is sort of like breathing: It will keep you alive, but won’t get your arm raised in victory.

Both VMware and Citrix have battle-tested hypervisors in ESXi/vSphere and XenServer, respectively. Both have data center assets and now IaaS and cloud OS capabilities. Both have management tools. Both are in their early stages with their PaaS offerings, with VMware’s recent launch of Cloud Foundry (interestingly, an open source PaaS) and Citrix’s OpenCloud, which leverages OpenStack.

Both also have a software portfolio, but this is where I think Citrix separates itself in a number of areas. One example is the network/WAN optimization for on-premise to off-premise connectivity noted above. The maturity of Citrix’s HDX technologies in conjunction with Xen Desktop have focused on solving the “last mile” of on-premise user experience with off-premise workload processing.

But all of these components must be interconnected. This is about a breadth of skill sets that work together in a “cloud spectrum” of sorts. This is similar to the breadth of specific technical skills that the MMA fighter must master in the art of jiu jitsu. At this stage, I give the nod to Citrix for having the better jiu jitsu.

However, there are many other contenders in the division. The list is long. Don’t count out AWS with its impressively consistent feature release cycles. Rackspace has approximately $1 billion worth of dedicated hosting and managed services clients to direct to the virtues of OpenStack. Microsoft, IBM and others have well-known component solutions but have demonstrated a lack of agility, among other things, to enable the true spectrum. There are contenders in Red Hat, Joyent, Virtustream and many others.

There will be no shortage of opportunities for very smart component technologies. The major players will either have to develop these things or acquire them. Boxers must learn to grapple, kick, and employ submission techniques. Wrestlers must learn to strike effectively and attack from their guard. One-dimensional CSPs need to become effective three-dimensional CSPs in order to win a belt.

VMware and Citrix just look like the top contenders right now. Yesterday was the weigh-in for their title fight. But expect more days like last Tuesday.

Rodney Rogers is co-founder and CEO of Virtustream.

Image courtesy of Flickr user fightlaunch.



View the original article here



Wednesday, 13 July 2011

How VMware wants to be the OS for the cloud

VMware on Tuesday morning launched the latest version of its vSphere virtualization management software, as well as a suite of integrated products for managing cloud computing environments. There isn’t much new aside from the capabilities in vSphere 5, but VMware’s timing was ideal to take some of the wind out of rival Citrix’s sails after it announced the acquisition of Cloud.com this morning.

VMware CEO Paul Maritz took the stage at a press conference/web conference this morning to lay out the case for why vSphere 5 and the new Cloud Infrastructure Suite are needed. His pitch was similar to what he told Om at Structure 2011, which is that we’re moving beyond the PC era, which means new applications, new platforms and entirely new ways that employees will interact with IT. For that to happen, infrastructure management just “needs to disappear.”

Essentially, Maritz explained, harkening back to his Microsoft roots, the new cloud suite is like Office for building automated data centers. It’s composed of a collection of previously disparate VMware products now designed to be easily deployed atop vSphere environments. By that logic, vSphere would be the Windows operating system. Gary Orenstein nailed the VMware-Microsoft analogy in greater detail in a May post.

Maritz also made a strong case for deploying VMware-based applications in the public cloud, broadening the company from its private cloud roots. Maritz claimed that more than 2,000 service providers currently utilize vCloud to some degree in operating their cloud computing offerings.

VMware CTO Steve Herrod also took the stage to discuss the slew of new features within the various components of the suite designed to make vSphere more dynamic and reliable when operating across larger, geographically dispersed server pools similar to the public clouds that VMware and its private-cloud brethren seek to emulate. Among the highlights are more powerful VMs running within vSphere, an app-store-like console for provisioning VMs, advanced disaster recovery methods, intelligent policy management, dynamic storage migration and advanced security capabilities. In all, there are hundreds of new features, which Herrod discusses in more detail on his blog.

VMware also is adjusting the licensing model for vSphere 5, focusing on pooled virtual memory allocations instead of the number of physical hosts. Although this still presumably caps the amount of resources available, it does allow for cloud-like flexibility of moving resources where they’re needed. Software licensing has always been a sticking point around virtualization and cloud computing because vendors want those upfront fees, but users want the flexibility of adding new servers without necessarily adding more costs. Unless VMware were to go full pay-per-use and abandon license fees altogether, this might be among the happiest balances it could find.
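A back-of-the-envelope comparison of the two licensing shapes makes the difference clear. The prices and license sizes here are hypothetical, not VMware’s actual price list:

```python
def per_host_cost(num_hosts, price_per_host):
    """Classic model: every physical host needs a license,
    whether its capacity is used or not."""
    return num_hosts * price_per_host

def pooled_vram_cost(total_vram_gb, gb_per_license, price_per_license):
    """Pooled model: licenses are sized by allocated virtual memory across
    the whole pool, rounded up to whole licenses -- add hosts freely as
    long as the vRAM pool doesn't grow."""
    licenses = -(-total_vram_gb // gb_per_license)  # ceiling division
    return licenses * price_per_license
```

Under the pooled model, adding a physical server to an existing pool costs nothing extra until workloads actually consume more virtual memory — which is the cloud-like flexibility described above, while the vRAM cap preserves the upfront fees vendors want.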

This was not the groundbreaking event that VMware promised it would be, but the launch should please VMware customers looking for a higher-performance, more automated and generally cloudier VMware experience. Following on the heels of the Citrix-Cloud.com news, however, it looks like VMware will have to keep its foot on the innovation gas pedal to keep from finally feeling the pressure of an open source cloud movement led by XenServer, OpenStack and Red Hat’s KVM-based approach.

VMware has invested its future in a spate of application acquisitions and its Cloud Foundry platform as a service, but those are longer-term strategies still in their infancies. Infrastructure is what’s making money now, and although VMware has the lead in revenue and deployments, it doesn’t have all the momentum.





View the original article here



Friday, 8 July 2011

Hands on with Amazon Cloud Player for iPad

When Amazon began offering MP3 song downloads without digital rights management (DRM) back in late 2007, I quickly jumped on the bandwagon. Even as an iPhone owner at the time, I ceased buying my music from iTunes, which later dropped DRM. Amazon continued to lead the way in mobile music this year by introducing its Cloud Player software for Android devices and integrating its music store with the online storage service. Amazon’s Cloud Player website can stream stored music as well, but until now, it hasn’t worked well on Apple’s iPad.

Amazon changed that yesterday with the introduction of Cloud Player support for the iPad browser. While you’d think the website might work for other iOS devices, too, it really doesn’t, based on my trying it on an iPod touch. On the iPad, however, it delivers a solid, if basic, music streaming experience. Hitting the http://www.amazon.com/cloudplayer link on my iPad brings up a clean two-paned interface: categorized music and playlists on the left, and track details on the right. Both of these areas are scrollable, so you can view long lists of albums, tracks or playlists. The web-based app works in either portrait or landscape mode.


Simple controls run along the bottom of the web page. You can play, pause, or skip/rewind tracks with touch buttons or enable random and repeat playback. The currently playing song appears, along with a progress meter for skipping around the song. Unlike in a native music application, however, you can’t scrub through the song by dragging the progress button. Instead, you have to tap on the meter to jump to a specific point in the song.

Creating a new playlist is simple, as is adding songs or whole albums to an existing playlist. There’s also a link at the top right to buy the MP3 album from Amazon, but that seems silly to me: If you already own the song or album and have it stored on Amazon’s servers, why would you need a link to the album in Amazon’s MP3 store? The only benefit I can see is for people looking for additional artist or album information, although it could also be handy for those who own just a few tracks and want to complete an album.

For all intents and purposes, outside of the track scrubbing, the Amazon Cloud Player site on iPad simulates a basic music application reasonably well. The music quality sounds no different from when I stream my tunes on a desktop browser, and thanks to iOS multitasking, I can use other apps on my iPad while streaming music over the web. I have noticed that the service runs best if it retains the focus, however. When using another app, the music tends to stop after a song or two. A quick return to the web page nudges the stream to start up right away: something I hope is addressed in the future.

In contrast to the service in a desktop browser or the native Amazon MP3 app for Android, there’s no function to either upload music or download music for local storage in the iPad web version. I’ll stick to using iTunes for that as needed, but for now, I’m happy to enjoy my Amazon stored music on the iPad. And although in 2007, I felt very “locked in” to Apple’s hardware when it came to music and media, thanks to Amazon, I feel I have some real options today.



View the original article here

