What are IT infrastructure services?

IT infrastructure refers to the composite hardware, software, network resources and services required for the existence, operation and management of an enterprise IT environment. It allows an organization to deliver IT solutions and services to its employees, partners and/or customers, and it is usually internal to the organization and deployed within its own facilities.

Nowadays, emerging technologies such as the SMAC stack (social, mobile, analytics and cloud) have changed the IT infrastructure landscape across organizations and domains. To keep up with the market and retain their competitive edge, organizations have to overhaul legacy infrastructure to modernize enterprise operations and ultimately improve profitability. The Infrastructure Services portfolio of IT consulting firms in Virginia can help organizations modernize and streamline processes across intricate IT landscapes.

IT infrastructure consists of all the components that play a role in overall IT and IT-enabled operations. It can be used for internal business operations or for developing customer-facing IT and business solutions.

Typically, a standard IT infrastructure consists of the following components:

Hardware: Servers, computers, data centers, switches, hubs and routers, and other equipment
Software: Enterprise resource planning (ERP), customer relationship management (CRM), productivity applications and more
Network: Network enablement, internet connectivity, firewall and security
Meatware: Human users, such as network administrators, developers, designers and end users with access to any IT appliance or service, are also part of an IT infrastructure, especially with the advent of user-centric IT service development. (A simple data-model sketch of these components follows below.)
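
To make the component categories above concrete, here is a minimal, purely illustrative sketch that models an infrastructure inventory as a small Python data structure. The class names, component names and category labels are hypothetical, not part of any standard.

```python
from dataclasses import dataclass, field

# Toy inventory model of the four component categories described above.
# All names are hypothetical and purely illustrative.

@dataclass
class InfrastructureComponent:
    name: str
    category: str          # "hardware", "software", "network", or "meatware"
    owner: str = "unassigned"

@dataclass
class Inventory:
    components: list = field(default_factory=list)

    def add(self, component: InfrastructureComponent) -> None:
        self.components.append(component)

    def by_category(self, category: str) -> list:
        return [c for c in self.components if c.category == category]

inventory = Inventory()
inventory.add(InfrastructureComponent("erp-prod-01", "software"))
inventory.add(InfrastructureComponent("core-router-1", "hardware", owner="network team"))
inventory.add(InfrastructureComponent("jdoe", "meatware", owner="service desk"))
print([c.name for c in inventory.by_category("hardware")])
```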

Optimize IT assets with agile infrastructure services

From cloud enablement of legacy platforms to desktop application migration and application virtualization, scalable Infrastructure Services from IT infrastructure service companies can help organizations address challenges in designing, provisioning, and maintaining IT infrastructure. They help deliver greater business impact from IT through services like Transformation Services, Managed Services, Consulting and Professional Services, Cloud Solutions and Next-Gen Platform-Centric Solutions.

An IT infrastructure services company's portfolio typically includes:

IT service desk
End user support
Enterprise systems & network management
Data center consolidation and hosting
Database services
Cloud hosting (AWS)
Project management and governance
Virtualization Solutions (VMware, Microsoft, Citrix)

With widespread experience in delivering focused infrastructure management services across domains, IT infrastructure providers offer organizations the benefits of strong industry partnerships with vendors such as SAP, Cisco, Oracle and Microsoft. Enable your business to run faster, smoother, and more reliably with complete infrastructure services.

How technology compresses everything: three once-unthinkable outcomes that benefit humanity

When humans first reached the moon, NASA relied on the most sophisticated computing equipment of its time to launch the first crewed missions into space. Tons of equipment, wires, circuit boards, and hardware occupied almost an entire office building to harness that technology.

Decades later, many times that computing power fits inside an ordinary smartphone. That is how far technology has evolved since then, and the same trajectory holds as components shrink from the microscopic scale to nanotechnology.

Nowadays, an automated, off-site managed service provider in VA can run an entire software and hardware infrastructure for a business that is far more advanced than the technology available in the early days of space missions.

As time goes by, technology keeps making devices smaller yet more powerful.

Cloud storage and network capabilities
Portability and mobility are the name of the game. Cloud networks and services are among the most promising technologies to have changed the business landscape, and the businesses that have embraced them have gained an edge over their competitors.

Goldman Sachs reported that cloud computing infrastructure has a compound annual growth rate (CAGR) of 30% from 2013 to 2018, significantly higher than the 5% figure for overall enterprise IT.
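
To put those growth rates in perspective, the standard compound-growth formula can be applied to the figures above. The short sketch below is illustrative only; the formula is the textbook CAGR definition, and the starting value is normalized to 1.0.

```python
# Compound growth: end_value = start_value * (1 + cagr) ** years.
# Applying the rates cited above over the 2013-2018 window (5 years)
# to a normalized starting value of 1.0.

def implied_multiple(cagr: float, years: int) -> float:
    return (1 + cagr) ** years

print(round(implied_multiple(0.30, 5), 2))  # ~3.71x for cloud infrastructure at 30% CAGR
print(round(implied_multiple(0.05, 5), 2))  # ~1.28x for overall enterprise IT at 5%
```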

By the end of the year, 33% of companies had embraced cloud technology, with 86% of companies spending a chunk of their IT budget on cloud-based services, whether managed in-house or outsourced to cloud infrastructure providers.

The evolution of scale
Through cutting-edge wireless and nanotechnologies, scientists are a step closer to helping people with severe neurodegenerative diseases that affect vision.

Scientists from Nanovision Biosciences Inc. and the University of California, San Diego have developed an implant prototype with the ability to respond to light signals in the retina.

The high-resolution retinal prosthesis uses nanowires and wireless electronics that work in much the same way as neurons in the retina, and it proved successful in in vitro laboratory tests using lab rats. Researchers developed the technology with the aim of helping those diagnosed with sight-affecting diseases like retinitis pigmentosa, macular degeneration, and complications of diabetes.

Revolutionary indoor farming technology
The global indoor farming revolution has taken the concept of sustainable farming and food production to a whole new level. One should no longer be restricted to the idea of having acres of land to create a sustainable farm.

Technology has equipped farmers with the capability to produce food within an enclosed facility and can even increase yield to meet demand. Modern indoor farms can reduce crop wastage, grow pest-free organic crops, and support high-yielding sustainable farming methods.

The technology is expected to grow at a CAGR of 20.8% between 2018 and 2022.

 

Velocity–Web Operations

I was really impressed with the Velocity conference, the organization, the content and the participants. The conference was a perfect size, not too big while at the same time not too small. Velocity had two main themes, Web Performance & Optimization and Web Operations. Common topics of discussion were performance, operations, change management, configuration management, monitoring, optimization, metrics, mobile, devops, web ops, agility, JavaScript, node.js, and DIRTy apps (data-intensive real-time apps).

I gravitated to the web operations talks but also sat in on some of the mobile and web optimization discussions. What I heard multiple times was how web performance and optimization has grown and really delivered value over the past five years, whereas the web operations sector has not grown at the same rate. From a web optimization perspective, browsers are faster, JavaScript has been optimized and delivers six times the performance of five years ago, node.js has become a framework for JavaScript on both the client and the server side, and an entire market has evolved around web optimization.

From an operations perspective, Amazon, Facebook and Google all have efforts focused on changing their data center and application delivery strategies, moving away from an older style of enterprise capacity planning that focused on scaling vertically at the application level, toward a newer web-scale approach to capacity planning and optimization that delivers a flexible infrastructure across all applications. Both Facebook and Google have created standards around server configurations, builds, deployments and decommissions, and Facebook has custom-built its data center, squeezing cost out of server builds and power delivery. Amazon focuses on capacity planning and has recently moved all Amazon online store applications off physical servers onto an EC2 infrastructure, putting all of their apps into the cloud, even if it is their own cloud.

Another area where web operations is adding value is in the continuous deployment and configuration management space. Common configuration management and deployment tools identified at Velocity were CFEngine, Puppet, Chef and the Cast project. There was a lot of talk about devops, which is basically introducing an operations strategy into the development process while at the same time introducing a development strategy into the operations process. Examples of this would be using a code repository for all application builds and deployments, and, from a development perspective, introducing developers to the monitoring and performance stats before they release their code. I agree with and fully support this strategy, however I do feel that there is too much hype surrounding the “devops” term, and I prefer “web operations” as a replacement for it.
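
The core idea behind configuration management tools like CFEngine, Puppet and Chef is declarative, idempotent state: describe the desired state, check the current state, and only act when they differ. The sketch below is a rough illustration of that pattern in plain Python, not how any of those tools are actually implemented; the file path and content are made up.

```python
import os

# Rough illustration of the idempotent, declarative style popularized by
# configuration management tools: check current state, change it only if
# it differs from the desired state. Paths and content are hypothetical.

def ensure_file(path: str, content: str) -> bool:
    """Ensure `path` exists with exactly `content`; return True if a change was made."""
    if os.path.exists(path):
        with open(path) as f:
            if f.read() == content:
                return False          # already in the desired state, do nothing
    with open(path, "w") as f:
        f.write(content)
    return True

changed = ensure_file("/tmp/example-motd", "Deployments go through the pipeline.\n")
print("changed" if changed else "no change needed")
```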

There was not a lot of discussion around gathering metrics, however a number of folks did talk about monitoring, looking at your data, and understanding your logs. John Allspaw gave an interesting talk on advanced postmortems, where he spoke of the Time to Detect (TTD) an incident, the Time to Recover (TTR) from an incident, and the overall impact time, which is TTR - TTD. John made the point that not all outages have the same severity, and that each organization should define its own severity levels and track TTD, TTR and impact time along with the severity level.
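
As a quick illustration of those postmortem metrics, here is a small sketch that tracks TTD, TTR, impact time (defined above as TTR minus TTD) and a severity level per incident. The incident names, numbers and severity scale are invented for the example.

```python
from dataclasses import dataclass

# Sketch of the incident metrics described above: Time to Detect (TTD),
# Time to Recover (TTR), impact time defined as TTR - TTD, and a severity
# level per incident. All values below are made up.

@dataclass
class Incident:
    name: str
    severity: int         # e.g. 1 = most severe, 4 = least severe
    ttd_minutes: float    # minutes from incident start until detection
    ttr_minutes: float    # minutes from incident start until recovery

    @property
    def impact_minutes(self) -> float:
        return self.ttr_minutes - self.ttd_minutes

incidents = [
    Incident("checkout errors", severity=1, ttd_minutes=4, ttr_minutes=42),
    Incident("slow search", severity=3, ttd_minutes=20, ttr_minutes=65),
]
for incident in incidents:
    print(f"{incident.name}: sev{incident.severity}, impact {incident.impact_minutes} min")
```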

I was also looking for some help with deployments, and Chef and the new Cast project look interesting, however I am going to hold off on a configuration management system at work and focus on building out our continuous deployment process. We have started this process in development using Bamboo, but our organization needs to commit to this strategy and then scale out the current Bamboo infrastructure to accommodate building all of our apps. There was more talk about application servers in the cloud than about application servers on premise, so my application server question did not get answered. A couple of vendors offered sophisticated Java monitoring products, however both came with a steep price tag. The one Java monitoring product that I thought was interesting was dynaTrace, which I will probably investigate.

A challenge that I have, and that I am sure many other enterprise-level organizations share, is that our infrastructures are carved out and deployed on an application-by-application basis, which means we have to pay for and build redundancy into every new application, as opposed to market leaders like Google, Amazon and Facebook, who have redundant infrastructures and simply add applications on top of them. The market leaders have had a lot of success with this infrastructure strategy; it is surprising that more large enterprises have not started down the same path.

I thought this was a valuable conference, and next year I am going to recommend that we have two or more folks attend Velocity. More information on speaker slides and video can be found here.

The Open Social Web

I have been talking and writing about Web 2.0, the open web, social networking, and social computing for a while now, and I now like to refer to all of this as the Open Social Web. In my version of the Open Social Web, all applications and content are built on open web standards that provide users with interoperability and control of their own data. While that may not be the case with many of the larger social networking sites, I like to think of it as a good goal to work towards. Adoption of social tools and applications has moved the digital conversation from blogs to social sites like Facebook, LinkedIn, Twitter, Buzz and Identi.ca, and the adoption of free mobile tools has helped to fuel this migration. While many people lose sight of the Open aspect of these tools, social adoption is still growing, and that is good for everyone. Social tools are changing how, when and where we communicate, and that benefits everyone. Openness leads to innovation, and innovation leads to better tools, applications and communication both within and outside of the social networking space. Location tools like Foursquare and Gowalla are great examples of innovation in the social space, and the wide adoption of desktop tools like Seesmic and TweetDeck proves that users want to pull all of their social data and communications into one tool or application, and companies are innovating to make that happen. I use Seesmic more than TweetDeck and both are great social aggregators, however I am still waiting for the tool that produces one open stream with all my social data.

An Open Social Web built on top of the same open infrastructure components will lead to an increase in discovery and sharing across all social sites. A good starting point for learning about open infrastructure technologies is the W3C Incubator Group Report on “A Standards-based, Open and Privacy-aware Social Web”. The W3C Incubator Group Report on Social Web Standards makes a case for Open Social Web Standards and focuses on identity, profiles, social media, privacy, activity streams, accessibility, open social networking projects and business considerations.

Adoption of these tools and standards is growing, however there is still a lot of user friction when sharing data between the large social networking sites and when it comes time to move to, or at least try, a new social networking site. This is too bad, because discovery of new social tools and friends is a big part of the social web, and there are many groups working to reduce this friction and let users take their credentials from site to site. OpenID, OAuth and XAuth are the prominent tools for logging into new social tools and services with your current credentials. Many of us have seen the Twitter, Facebook, Google and OpenID buttons presented when logging into new services, and the advantage of using these buttons for authentication is access to your current friends at Twitter, Facebook or Google on the new service. This enables a pass-through back to your authentication point of choice, which lets the user easily identify current friends on the new service. This functionality helps with discovery, and as I said, discovery is a big part of exploring the social web. Joseph Smarr and John Panzer outlined many of these tools in their Google I/O 2010 talk on building fluid social experiences across websites.
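
For readers who have not looked under the hood of those “log in with …” buttons, here is a rough sketch of the authorization-code pattern used by OAuth 2.0, a later variant of the OAuth protocol mentioned above. Every URL, client ID and secret below is a placeholder, not any real provider's endpoint, and a production flow would also add state/CSRF protection and run inside a proper web framework.

```python
import urllib.parse
import requests  # third-party: pip install requests

# Rough sketch of the OAuth 2.0 authorization-code flow behind
# "log in with ..." buttons. All URLs and credentials are placeholders.

AUTHORIZE_URL = "https://provider.example.com/oauth/authorize"
TOKEN_URL = "https://provider.example.com/oauth/token"
CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"
REDIRECT_URI = "https://yourapp.example.com/callback"

# Step 1: send the user to the provider's authorization page.
params = {
    "response_type": "code",
    "client_id": CLIENT_ID,
    "redirect_uri": REDIRECT_URI,
    "scope": "profile",
}
print("Send the user to:", AUTHORIZE_URL + "?" + urllib.parse.urlencode(params))

# Step 2: after the user approves, the provider redirects back with ?code=...
# The application exchanges that code for an access token server-side.
def exchange_code(code: str) -> dict:
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": REDIRECT_URI,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    })
    resp.raise_for_status()
    return resp.json()  # typically contains access_token, expires_in, etc.
```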

Thoughts on Facebook

In November of 2009, I read The Accidental Billionaires by Ben Mezrich, which was a story about the founding of Facebook, and my impression based on the book was that Facebook's founder Mark Zuckerberg screwed everyone on his way to building Facebook. I have recently finished reading The Facebook Effect by David Kirkpatrick, and I now have a different opinion of Mark Zuckerberg. I think he showed a lot of maturity and poise as he and the Facebook team met the challenges of growing and scaling Facebook into a network of over 500 million users, and I do not think that he screwed anyone.

I do not think that the lawsuit filed against Mark Zuckerberg by Harvard students Cameron Winklevoss, Tyler Winklevoss, and Divya Narendra held much merit, as Zuckerberg did not sign a contract with them, and in 2004 there were already many examples of social networking companies like LiveJournal, Friendster and LinkedIn. I think it is a stretch to say that the Harvard students came up with the social idea and Zuckerberg stole it, as there were many other colleges doing the same thing, and you have to give Zuckerberg credit for executing on his idea and then growing it past the college market.

I do think that Mark forced Eduardo out, but Eduardo also shares some responsibility: as Facebook was growing, Eduardo and Mark were fighting over money and control. Both Mark and Eduardo could have handled that situation better, however Eduardo eventually signed a shareholder agreement that allotted him 3 million shares of common stock, for which he agreed to hand over all relevant intellectual property along with his voting rights to Mark Zuckerberg. Since that time, Mark has issued more stock, diluting Eduardo's shares, however Eduardo is still a very rich man with about 5% of Facebook's stock.

What really changed my mind was reading about Zuckerberg's passion for growing the business and for putting growth before making money. According to David Kirkpatrick, Zuckerberg turned down many chances to sell Facebook, wanting instead to grow the service and make it into a utility. If Mark were just out to make a quick buck, he would have taken the money and run, however he did not. Mark does come across as being obsessed with control of his company, but who could really blame him?

I have not seen the movie The Social Network, but now I will view it from a different perspective. I recently watched Zuckerberg's interview at the Web 2.0 Summit 2010 and liked what I heard. Two points that I heard more than once in Mark's talk were:

  • “over the next five years almost every major product vertical is going to get rethought to be social”
  • “designed around people”

Other thoughts from Mark were:

  • “Facebook’s value system errs on the side of openness and portability”
  • feedback that we hear a lot is “we want control of the information that we put on the site” and a lot of people are asking for user control.
  • “Our default is to build an open platform, the stuff that we work on like groups location messages are like really core parts and are either distribution channels or we view are the foundational building blocks of the social graph.”

Virtualization and the path to the Cloud

Have you noticed the shift in hype within computing news away from cloud computing? I have, and I think this is great, because the hype tends to distort the facts. Also, while the hype is fading, adoption of cloud computing continues to grow and is predicted to keep growing. Ryan Nichols published a story in Computerworld in August 2010 called “Cloud computing by the numbers: What do all the statistics mean?” where he outlined statistics from IDC, Gartner and Merrill Lynch, all estimating dramatic increases in cloud computing adoption. Also in August of 2010, Andre R Hickey from ChannelWeb wrote a story on Small and Medium Business (SMB) spending on cloud computing, citing research from AMI Partners predicting that cloud computing spending by SMBs will exceed $95 billion by 2014. These are great predictions, however how are organizations going to make the switch from a dedicated server and network infrastructure to a scalable, on-demand model for delivering services?

Virtualization

Virtualization will be the key for many organizations, as many large enterprises have already deployed virtualization and are learning what it takes to maintain and support a large virtualized infrastructure. Many think, and I agree, that virtualization is the first phase toward cloud computing, while others already consider their virtual environments a cloud, which I do not want to dispute. Now, organizations need to master the support and scale of virtualization and then consider some sort of private or hybrid internal cloud before moving applications to the public cloud. Jon Oltsik, a Principal Analyst at the Enterprise Strategy Group, recently wrote that “Many firms still struggle with performance issues when trying to align physical networks, storage devices, and servers with virtualization technology” and outlined how many organizations are not using the full features of virtualization, like VMware's vMotion and vCloud, to improve reliability and service delivery. I agree with Jon's assessment and want to emphasize the importance of mastering your virtual infrastructure and providing a high level of support to your virtualized resources. Cloud computing offers improved elasticity and scaling, which is also found in many virtual infrastructures, but having these capabilities is one thing; actually taking advantage of them is another.
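
Mastering a virtual infrastructure starts with being able to see and manage it programmatically. As a minimal sketch of that idea (using the open libvirt Python bindings rather than any vendor-specific SDK such as VMware's, and assuming libvirt-python is installed and a local hypervisor is reachable at the URI shown), here is a script that simply takes inventory of the guests on a host.

```python
import libvirt  # libvirt-python bindings; assumes a reachable hypervisor

# Minimal sketch: connect to a hypervisor and list its guests with their
# state. Uses the open libvirt API purely for illustration; the connection
# URI depends on your environment.

conn = libvirt.open("qemu:///system")
try:
    for dom in conn.listAllDomains():
        state = "running" if dom.isActive() else "stopped"
        print(f"{dom.name()}: {state}")
finally:
    conn.close()
```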

For many organizations this is a change in mindset, as they move their critical processing off dedicated servers onto shared virtual resources, because the management of that shared virtualized infrastructure is now key to the delivery of all services. Organizations that grow accustomed to shifting and deploying virtual resources will be one step closer to the cloud and will make an easier transition to using external resources in the cloud.

Google Buzz

Have you tried Google Buzz yet? I have, and I like it a lot. It has many of the discussion and “follow”-like features of Twitter, however unlike Twitter, you can see both sides of the conversation. Also, after using it for about two weeks, there appears to be more technical content, links and discussion and fewer posts about going to lunch or what you had for dinner.

Granted, I follow a lot of tech folks, so it is not surprising that I would see a lot of tech content, however this appears to be an easier solution than Twitter, Identi.ca or RSS for searching and for following what others say about a post or topic. It takes the discovery and follow aspects of microblogging and the full commenting aspects of regular blogging and combines them into one solution.

Another nice feature of Buzz is the ability to mute a buzz or post, which I have found useful, especially when I am not interested in a topic that others are discussing heavily and there are a lot of comments. It's a nice way to clean up the stream.

Buzz is built on Google Gmail, and although it is still in the early adopter days, I think that it is a great discussion platform and well worth looking at. If you do not have a Gmail account, then I would recommend that you sign up for one today.

The world of social networking and social computing is growing, with many tools that enable you to reach out and connect to others. In three to five years' time, everyone will have multiple social computing accounts, similar to the way that many of us have multiple email addresses. So my question to you is: why wait?

Jump in and participate, if you are not sure what to do, try one of these sites …

Linkedin.com, Facebook.com, Twitter.com, www.google.com/buzz, Identi.ca

The Open Web and Mozilla Drumbeat

I believe in and support the Open Web, the Open Internet and the Mozilla Drumbeat project. In my opinion, openness leads to innovation, innovation leads to progress, and progress is good for us all.

For many non-technical folks, there is a subtle confusion between the Web and the Internet, so an easy way to think about the difference is to think about our roads and highways and think of them as the Internet. This is the Infrastructure that we use every day and do not even think about, just like the Internet. The web, on the other hand, is like the cars and trucks that use our roads. Compare a car to a Browser or a Web Site that rides on top of the road and on top of the Internet. The power of the Web comes from the tools and protocols, programs, APIs, and data that make it work, however, the Web needs the Internet and without the Internet, we do not have a Web.

The Mozilla Drumbeat project advocates for the Open Web, and they are looking for help from you and me. The concept of Drumbeat is to let everyone know about the freedoms of the Internet, how those freedoms may be slipping away from us, and what we can do to help keep the Internet and the Web open and moving forward.

For many folks, our digital roads (the Internet) and digital cars (browsers and web sites) are already open and do not appear to be at risk, so what is all the fuss about? Concerns start when you think about where we have come from and how we have evolved, and when you look at the experiences of other markets over a long period of time.

When Tim Berners-Lee created HTTP and HTML, he did not license or charge royalties for his technologies; instead he moved them into the public domain for others to use. This is how the Web grew, and it was literally the start of the Open Web. However, since that time we have witnessed many organizations attempting to carve out their piece of the Web by charging for access to content and data. After a while, charging for and restricting access to content and networks became the norm, but that is starting to change now, and we want to make sure that it continues to change.

The Web is evolving along with Technology all around us. Television and Video are moving to the Web and the Internet is moving toward the Television. Fairly soon the Web will be just another channel to access content from your couch, but we want to make sure that all web content will reach that couch and is not restricted by the large media conglomerates responsible for Television today.

Think about the television and radio business in the early '60s. The focus was on local news and local programming, and this was very beneficial to everyone. However, over time the heavyweights in the industry started buying up stations, and before you knew it most of the stations were owned by a small number of large media conglomerates. Less competition and less outside interference led to selective programming and reporting slanted toward the media conglomerates and not toward the people. We can't let this happen to the Web.

There are forces at work looking to build a tiered Internet, where the large players will have access to a larger, faster Internet tier that the rest of us will be restricted from. There is a large debate going on now about this change, often referred to as Net Neutrality, where the premise is that the Internet is a free and open resource and Internet service providers should not carve it up, as doing so would lead to discrimination against content and traffic on the Internet. I am for Net Neutrality and the Open Web, and I encourage you to explore the Mozilla Drumbeat project and get involved.

For more information about Project Drumbeat, here is a slideshow from the Mozilla Foundation Executive Director Mark Surman on how you can get involved and help the Open Web.

Seven Technology Predictions for 2010

For the last few years I have been trying to identify the trends and directions of the technology industry, so I thought I would try again and make some predictions for 2010. You can read about my 2009 predictions here; they were close, with a few misses. Four of my predictions from last year are still high on my list for this year, however the order has shifted in my mind. Here are my predictions for 2010:

1) The Social Web

There is a lot of hype and a lot of work around introducing social tools and communities into our daily life. The majority of these tools are delivered via the web in communities like Facebook, Twitter, Identi.ca, Plaxo, and FriendFeed. Google also has a stake in this game with OpenSocial, Google Apps, and Google Groups, and Microsoft is getting in late with the Microsoft Azure Platform. Also, there is a lot going on behind the scenes in the social tools and apps space, with open efforts around identity, activity streams and discovery. These low-level efforts will help to shape our products and enable innovation. Social has bloomed on the web and will be integrated into many products in 2010, so now it is time for the enterprise.

One area that has helped the Social Web adoption is the integration of social applications on smartphones which enable users to stay connected to their Social communities. Another area is the use of inexpensive networked video cameras which has enabled content creators to quickly capture the moment and share it with their communities.

As Joseph Smarr once said …

The Web is going social … and the Social Web is going Open

I think this statement is very true ….

2) The Open Web

The web started as an Open project when Tim Berners-Lee released HTTP & HTML into the public domain in the early 90’s. Since then, organizations have been carving up their little piece of the web and restricting access to many. There are apps and solutions that are completely open, some that are partly open and others that are completely closed. There will be a big push to advocate for and adopt open strategies as more people start to participate in the Social Web and look to integrate all of their social tools.

The move toward an Open Web strategy has started to become organized, with multiple groups participating in the effort. The Open Web Foundation was established in 2009 as a legal and standards-based group for developing open technologies, the Open Web Advocacy Group, an open Google Group, was created as a forum for Open Web developers, and the Mozilla Drumbeat project was established as an advocacy group for the Open Web.

How about a real example of the Open Web? Twitter is not a totally open company or application, however it does have an open API, which has benefited it greatly. By opening its API and access to Twitter data, Twitter has allowed a sub-market to grow around that data. Hundreds of companies have been established that enhance Twitter data and provide services to Twitter users. Without the open Twitter API, that would never have happened.
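
This is part of why an open API spawns an ecosystem: any third party can pull public data with a few lines of code and build a service on top of it. The sketch below shows the general shape of such a call; the endpoint URL and response fields are placeholders, not Twitter's actual API.

```python
import requests  # third-party: pip install requests

# Illustration of consuming an open REST API to build a service on top of
# someone else's data. The endpoint and response fields are placeholders,
# not Twitter's real API.

def fetch_recent_posts(username: str) -> list:
    url = f"https://api.example-social.com/v1/users/{username}/posts"
    resp = requests.get(url, params={"count": 10}, timeout=10)
    resp.raise_for_status()
    return resp.json()

# Example usage (would require the placeholder service to exist):
# for post in fetch_recent_posts("someuser"):
#     print(post.get("text", ""))
```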

3) Cloud Computing

Last year I had cloud computing at the top of my list and really expected adoption to be greater than it turned out to be in 2009. That was partially due to security and legal concerns about cloud computing, but also due to the fact that many enterprise organizations will need to change their infrastructure to take advantage of cloud computing, and most do not realize that. I think cloud computing will continue to grow in 2010, with some of the larger, more established vendors acquiring many of the smaller vendors.

I also see the adoption of private cloud increasing as many organizations get their first taste of cloud solutions in a controlled environment. There is a need for improved security and vendor accountability in the cloud space and I anticipate that there will be one or two new vendors on the scene in 2010 offering increased security and accountability at a premium.

4) Mobile Computing

Computing functionality is moving to the phone, as evidenced by the many smartphones available today. In 2010 we will see a whole new line of smartphones, and smartphone adoption will increase in enterprises as business managers realize that smartphones allow for constant connectivity and as adoption grows beyond the techie IT crowd.

I anticipate a line of semi-smart phones that offer some but not all of the features of a smartphone at a cheaper price. Many folks are looking for basic phone service with limited or no data plans, however application store features will be available on all phones in 2010 as the providers look for more ways to generate revenue.

The integration of social applications on smart phones has increased and will help to fuel the Social Web as smart phone users stay connected to their Social communities all the time.

5) Enterprise Social Computing

Adoption of Web 2.0 and Social tools in the Enterprise will increase however it will continue at a slow pace. Enterprises are adopting Enterprise Social Computing much like they adopted Intranets, in a slow and structured manner and need to get their feet wet before adopting any large scale organizational efforts. The good news is that more Business folks understand Enterprise Social Computing and can see the value of improved collaboration for their process.

Many organizations do not have an Enterprise Social Computing strategy, however I see that changing in 2010 as many organizations come out with Enterprise Social Computing policies for their users. This will be a clear indication for the user community of what is appropriate and what is not, and it will fuel Social Computing within the Enterprise.

6) Enterprise Infrastructures

Enterprise infrastructures are changing. Most organizations have already adopted virtualization, others are experimenting with cloud computing, and everyone is looking for strategies to decrease power and cooling requirements. Most larger organizations have data centers that were designed many years ago for large transactional processing requirements, and that has not changed: the processing continues on newer hardware but with the same old infrastructures. One reason both Google and Amazon have become successful is that they are not tied to older transactional architectures but instead created their own architectural stack, which enables them to deliver massive computing power to end users. As more organizations start to work with private clouds, the architecture required to support them will become apparent and will start to fuel a change in applications and architectures.

The other enterprise infrastructure shift that I see is in the area of identity management and governance. Most organizations have at least three different methods of authentication, including Active Directory and multiple LDAPs, all architected to be used behind the firewall. This is the year that organizations will start to look outside the enterprise and join federated IDMs for a subset of their users, customers and partners. Another identity-related solution is Information Cards, which have gained adoption on the Web and will eventually make their way into the enterprise.
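
Underneath Active Directory and most other enterprise authentication sits a directory lookup over LDAP. As a minimal sketch of that building block (using the third-party ldap3 library; the hostname, bind DN, credentials and base DN below are placeholders for your environment), here is what such a lookup looks like.

```python
from ldap3 import ALL, Connection, Server  # third-party: pip install ldap3

# Sketch of the directory lookup behind most enterprise authentication
# (Active Directory or another LDAP). Host, bind DN, password and base DN
# are placeholders for your environment.

server = Server("ldap.example.com", get_info=ALL)
conn = Connection(server, "cn=svc-reader,dc=example,dc=com", "secret", auto_bind=True)

conn.search(
    search_base="dc=example,dc=com",
    search_filter="(uid=jdoe)",
    attributes=["cn", "mail", "memberOf"],
)
for entry in conn.entries:
    print(entry.entry_dn, entry.mail)

conn.unbind()
```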

7) Big Players in Technology: Amazon, Apple, Google, Microsoft, Oracle

I see the large, established players like Amazon, Apple, Google and Oracle growing and doing well in 2010, however I see Microsoft slowing down. I think that Amazon, Apple, Google and Oracle are poised to take advantage of the web, as all are nimble enough to shift direction if needed and all appear to have new products and services in the pipeline. Windows 7 and Microsoft SharePoint will be the two high points for Microsoft, while overall sales will decline.

I see Google as the big winner here. They are embracing the Open Web, and moving out of their comfort zone of search with new voice and social applications and of course the rumored Google Phone.

Google’s Meaning of Open

Jonathan Rosenberg, Senior Vice President, Product Management at Google sent a long essay to the Google Product Managers and Engineers in an effort to put some clarity around the meaning of “Open” at Google. The well-written essay called “The Meaning of Open” ended up on the Google blog and is well worth reading.

Jonathan outlines Open at Google as …..

There are two components to our definition of open: open technology and open information. Open technology includes open source, meaning we release and actively support code that helps grow the Internet, and open standards, meaning we adhere to accepted standards and, if none exist, work to create standards that improve the entire Internet (and not just benefit Google). Open information means that when we have information about users we use it to provide something that is valuable to them, we are transparent about what information we have about them, and we give them ultimate control over their information. These are the things we should be doing. In many cases, we aren’t there, but I hope that with this note we can start working to close the gap between reality and aspiration.

and he goes on to make points like these ….

If we can embody a consistent commitment to open — which I believe we can — then we have a big opportunity to lead by example and encourage other companies and industries to adopt the same commitment ….

whenever possible, use existing open standards. If you are venturing into an area where open standards don’t exist, create them. If existing standards aren’t as good as they should be, work to improve them and make those improvements as simple and well documented as you can …..

We believe in the power of technology to deliver information. We believe in the power of information to do good. We believe that open is the only way for this to have the broadest impact for the most people. We are technology optimists who trust that the chaos of open benefits everyone. We will fight to promote it every chance we get ….

The future of government is transparency. The future of commerce is information symmetry. The future of culture is freedom. The future of science and medicine is collaboration. The future of entertainment is participation. Each of these futures depends on an open Internet ….

I give Jonathan credit for jumping in and offering an opinion and a direction for the Googlers while at the same time welcoming comments and differences of opinion. I applaud the effort and feel that Google should lead by example. I also like the fact that Jonathan identified Open Information as well as Open Technologies, because I feel that Open Information/Data is often missed in discussions about the Open Web.