
Cloud Computing: The New Web 2.0 Paradigm

Cloud computing is a computing paradigm in which tasks are assigned to a combination of connections, software and services accessed over a network. This network of servers and connections is collectively known as "the cloud." Computing at the scale of the cloud allows users to access supercomputer-level power. Using a thin client or other access point, like an iPhone, BlackBerry or laptop, users can reach into the cloud for resources as they need them. For this reason, cloud computing has also been described as "on-demand computing."
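To make the "on-demand" idea concrete, here is a minimal sketch of a thin client handing a job to a cloud service over HTTP. The endpoint URL and request format are hypothetical, invented purely for illustration; any real service would define its own API:

```python
# A minimal sketch of the on-demand model: a thin client sends a job to a
# cloud endpoint and reads back the result. The URL and JSON shape here are
# hypothetical, purely for illustration.
import json
import urllib.request

def run_in_cloud(task: dict) -> dict:
    # Hypothetical cloud endpoint; a real provider would publish its own API.
    url = "https://cloud.example.com/compute"
    payload = json.dumps(task).encode("utf-8")
    request = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# The client stays thin: all the heavy lifting happens on the server side.
result = run_in_cloud({"op": "sum", "values": [1, 2, 3]})
```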

This vast processing power is made possible through distributed, large-scale cluster computing, often in concert with server virtualization software, like Xen, and parallel processing. Cloud computing can be contrasted with the traditional desktop computing model, in which the resources of a single desktop computer are used to complete tasks; it can also be seen as an expansion of the client/server model. To paraphrase Sun Microsystems' famous adage, in cloud computing the network becomes the supercomputer.
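As a rough illustration of the parallel-processing idea, the following sketch splits a large job into chunks and fans them out to worker processes. A real cloud cluster would distribute the same chunks across many machines rather than local cores, but the divide-and-combine pattern is the same:

```python
# Toy illustration of parallel processing: split a big job into chunks,
# process the chunks concurrently, then combine the partial results.
from concurrent.futures import ProcessPoolExecutor

def process_chunk(chunk: list) -> int:
    # Stand-in for real per-chunk work (e.g., indexing, transcoding, analysis).
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i : i + 100_000] for i in range(0, len(data), 100_000)]
    with ProcessPoolExecutor() as pool:
        partials = list(pool.map(process_chunk, chunks))
    print(sum(partials))
```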

Cloud computing is often used to sort through enormous amounts of data. In fact, Google has an initial edge in cloud computing precisely because of its need to produce instant, accurate results for millions of incoming search queries every day, parsing through the terabytes of Internet data cached on its servers. Google's approach has been to design and manufacture hundreds of thousands of its own servers from commodity components, connecting relatively inexpensive processors in parallel to create an immensely powerful, scalable system. Google Apps, Maps and Gmail are all based in the cloud. Other companies have already created Web-based operating systems that collect online applications into Flash-based graphical user interfaces (GUIs), often with a look and feel intentionally quite similar to Windows. Hundreds of organizations are already offering free Web services in the cloud.
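Google's published MapReduce model is a useful mental picture of how such systems parse large datasets on commodity hardware. Below is a single-machine toy version, assuming a simple word-count task; real deployments shard both the map and reduce phases across thousands of servers:

```python
# Single-machine sketch of the map/reduce style: map each document to word
# counts, then reduce the per-document counts into one global tally.
from collections import Counter
from concurrent.futures import ProcessPoolExecutor

def map_phase(document: str) -> Counter:
    # Emit a count per word occurrence in one document.
    return Counter(document.lower().split())

def reduce_phase(partials: list) -> Counter:
    # Merge the per-document counts into a single result.
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    documents = ["the cloud is the computer", "the network is the computer"]
    with ProcessPoolExecutor() as pool:
        partials = list(pool.map(map_phase, documents))
    print(reduce_phase(partials).most_common(3))
```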

In many ways, however, cloud computing is simply a buzzword used to repackage grid computing and utility computing, both of which have existed for decades. Like grid computing, cloud computing requires the use of software that can divide and distribute components of a program to thousands of computers. New advances in processors, virtualization technology, disk storage, broadband Internet access and fast, inexpensive servers have combined to make cloud computing a compelling paradigm. It allows users and companies to pay for only the services and storage they need, when they need them and, as wireless broadband options grow, where they need them. Customers can be billed based upon server utilization, processing power used or bandwidth consumed. As a result, cloud computing has the potential to upend the software industry entirely, as applications are purchased, licensed and run over the network instead of on a user's desktop. This shift will put data centers and their administrators at the center of the distributed network, as processing power, electricity, bandwidth and storage are all managed remotely.
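To see what usage-based billing looks like in practice, here is a toy calculation in the utility-computing spirit. The pricing dimensions and rates are invented for illustration; real providers publish their own:

```python
# Toy utility-computing bill: charge only for what was consumed.
# All rates and usage figures below are made up for illustration.
RATES = {
    "cpu_hours": 0.10,     # dollars per CPU-hour
    "storage_gb": 0.15,    # dollars per GB-month
    "bandwidth_gb": 0.12,  # dollars per GB transferred
}

def monthly_bill(usage: dict) -> float:
    # Sum each metered dimension times its rate.
    return sum(RATES[item] * amount for item, amount in usage.items())

print(monthly_bill({"cpu_hours": 500, "storage_gb": 200, "bandwidth_gb": 50}))
```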
Courtesy: http://searchenterprisedesktop.techtarget.com

