The Internet of Things is pushing the data center to the edge
The IT industry spotted the approaching data iceberg long ago. With the rapid growth of Internet of Things (IoT) devices, the collision is now imminent, and it is time to act. Edge computing may be our best chance to avert the coming data-overload crisis.
In a single day, the world now generates 2.5 exabytes (EB) of data. That sounds staggering, but compared with what is coming it is only the tip of the iceberg. Forecasts suggest that by 2025, vehicle-to-cloud data transfers alone will reach 10 exabytes per month, and that is just one application, to say nothing of applications we have not yet imagined.
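To put that forecast in perspective, a quick back-of-the-envelope calculation shows what 10 EB per month means as a sustained transfer rate. This is an illustrative sketch only; it assumes decimal units (1 EB = 10^18 bytes) and a 30-day month.

```python
# Illustrative arithmetic only: what does 10 EB/month look like per second?
# Assumes decimal units (1 EB = 10**18 bytes) and a 30-day month.
EB = 10**18

def monthly_eb_to_tbps(exabytes_per_month: float, days: int = 30) -> float:
    """Average sustained throughput, in terabits per second."""
    seconds = days * 24 * 3600
    bits = exabytes_per_month * EB * 8
    return bits / seconds / 10**12

# 10 EB/month of vehicle-to-cloud traffic averages out to roughly 31 Tbit/s,
# sustained around the clock, for this one application alone.
print(round(monthly_eb_to_tbps(10), 1))
```

Even spread evenly over every second of the month, that single application would demand tens of terabits per second of aggregate uplink capacity.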
So how do we keep data centers from being swamped by the IoT's invisible information iceberg? Consider that the connection to the cloud is not always fast enough, and even when it is, the back end may not be able to process data at IoT scale. Before long, we will be forced to be selective about which data we send to the cloud at all. As Cisco's Helder Antunes has put it, the assumption that all data can be shipped to the cloud over strong, stable bandwidth between cloud and edge devices "is simply not practical."
The origins of edge computing: the content delivery network (CDN)
Edge computing is a hot topic today, but if we broaden its definition slightly, it is not a new technology. Content delivery networks (CDNs) such as Akamai date back to the 1990s. A CDN replicates static content to many locations so that it sits closer to where it is actually consumed, and it has proven to be a very efficient model for delivering content such as streaming video. Edge computing builds on this approach, but goes further: individual nodes at the network edge not only store and deliver content, they also receive and process data.
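The distinction above can be sketched in a few lines of code. This is a hypothetical toy model, not any real CDN's API: a CDN node only caches and serves content, while an edge node adds local data processing on top of the same caching behavior.

```python
# Toy model (all names hypothetical) of the CDN-vs-edge distinction:
# a CDN node caches and serves content; an edge node also processes data.
class CdnNode:
    def __init__(self, origin):
        self.origin = origin          # callable standing in for the origin server
        self.cache = {}

    def get(self, key):
        if key not in self.cache:     # cache miss: fetch once from the origin
            self.cache[key] = self.origin(key)
        return self.cache[key]        # cache hit: served locally

class EdgeNode(CdnNode):
    def ingest(self, reading):
        """Process device data locally instead of forwarding it upstream."""
        return {"device": reading["id"], "ok": reading["temp"] < 80}

origin = lambda key: f"content-for-{key}"
edge = EdgeNode(origin)
print(edge.get("video.mp4"))                        # fetched, then cached
print(edge.ingest({"id": "sensor-1", "temp": 72}))  # handled at the edge
```

The subclass relationship mirrors the article's point: edge computing is a superset of the CDN model, inheriting content delivery and adding computation.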
With the rise of cloud computing, such intelligent nodes now sit between users and cloud data centers, and they have begun to drive the technology forward significantly. This hybrid approach lets us strengthen data protection and deploy low-latency solutions on site while still exploiting the flexibility and accessibility of the cloud.
New classes of applications have also created new demands on how we move and process data. Self-driving cars, smart homes, and smart manufacturing all involve connected devices that generate a great deal of useful data, but the raw data by itself cannot help a car avoid a traffic jam, trigger preventive maintenance on factory equipment, or notify a parent by text message. All of this requires some form of processing, perhaps real-time analytics, machine learning, or other kinds of artificial intelligence.
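The kind of local processing described above, and the selectivity about cloud uploads mentioned earlier, can be sketched together in a few lines. This is an illustrative example with made-up names and thresholds: an edge gateway keeps the raw sensor stream local and forwards only anomalies plus a compact summary to the cloud.

```python
# Hedged sketch of real-time filtering at the edge (names and threshold are
# illustrative): raw readings stay local; only anomalies and a small summary
# are forwarded to the cloud, cutting upstream bandwidth dramatically.
def filter_for_cloud(readings, threshold=90.0):
    """Return (anomalous readings to upload, summary of the whole batch)."""
    anomalies = [r for r in readings if r["temp"] > threshold]
    summary = {
        "count": len(readings),
        "avg_temp": sum(r["temp"] for r in readings) / len(readings),
    }
    return anomalies, summary

readings = [{"temp": t} for t in (70.0, 71.5, 95.2, 69.8)]
anomalies, summary = filter_for_cloud(readings)
print(len(anomalies), summary["count"])  # 1 anomalous reading out of 4
```

Here only one reading out of four crosses the wire; the rest is reduced to a count and an average, which is exactly the trade the article describes.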
Today, most of this processing happens in large data centers. But as demand for computing resources grows, that model will not keep up: these use cases generate ever more data and require response times faster than the cloud can deliver.
Of all data today, only about 10% is processed outside the cloud or a data center, and Gartner predicts that by 2022, 50% of data will be processed elsewhere, that is, at the edge. Another trend shaping the future is the rollout of 5G mobile networks in many countries, including the US. As they upgrade, telecom operators are deploying, or actively considering, micro data centers colocated with the new 5G base stations.
The edge becomes a battleground
A variety of new technologies are competing for position in the space between the device and the cloud, including Microsoft's Azure IoT Edge, AWS Greengrass, AWS Lambda, and the Akraino Edge Stack, which Intel is helping to develop.
Horizontal or vertical?
These emerging edge technologies include vertical solutions that span IoT devices, edge servers, and the cloud, as well as horizontal approaches that focus on integrating edge computing into a broad range of devices, for example by running applications inside containers or by deploying hypervisors across the devices of a given infrastructure.
Pros and cons
Placing storage and computing resources close to the data source greatly reduces latency and the cloud bandwidth required. What about security? That is an important consideration that depends on the application. Some experts argue that edge computing improves security by limiting how much data travels over the open Internet, which matters especially for companies that are not allowed to take sensitive data off site. On the other hand, the growing number of IoT devices and the additional tiers of the infrastructure (including edge servers) present a larger attack surface to attackers: every new endpoint is a potential breach of cloud security and a path into the core network.
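The latency claim above is easy to sanity-check with rough physics. The sketch below uses made-up distances and considers propagation delay only (signals travel at roughly two-thirds of the speed of light in optical fiber); real round trips add queuing, routing, and processing time on top.

```python
# Rough, illustrative round-trip-time comparison. Assumptions: propagation
# delay only, ~200,000 km/s signal speed in fiber, distances made up.
C_FIBER_KM_S = 200_000  # approx. 2/3 the speed of light, in km/s

def rtt_ms(distance_km: float) -> float:
    """Best-case round-trip time in milliseconds for a given one-way distance."""
    return 2 * distance_km / C_FIBER_KM_S * 1000

print(round(rtt_ms(1500), 1))  # distant cloud region: ~15 ms before any processing
print(round(rtt_ms(10), 2))    # nearby edge site: ~0.1 ms
```

Even in this best case, a distant cloud region costs two orders of magnitude more round-trip latency than an edge site a few kilometers away, which is why latency-sensitive workloads gravitate to the edge.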
Moving infrastructure from the cloud to the edge also raises questions of ownership, operations, and maintenance. Part of the reason companies embrace cloud computing is that it sharply reduces the burden of management and maintenance. If computing infrastructure migrates from the cloud to the edge, who will own and operate it?
These questions are hard to answer today, because we do not yet know which technologies will win. Only one thing is certain: whoever runs the edge data centers of the future, and whatever the size of those facilities, efficiency will be key. Matching the operational efficiency of large data centers will not be easy, and the same goes for reliability. For these reasons, no single solution will satisfy every customer's edge computing use case; each edge facility will need to be tailored to its application. Whatever the scale, as data begins to move to the edge, data center infrastructure, especially power and cooling, will be decisive for success.