With the rise of Bitcoin and blockchain, one term has entered everyday vocabulary: computing power.
Computing power is not used only for mining, however. Deep learning, which emerged during the same period, also owes its rise to the rapid growth of computing power.
Put simply, computing power is the ability to calculate, and it is nothing new. Computing power has existed as long as computers have; it just used to be called computing speed.
For example, a CPU that can perform 5,000 additions per second has a certain computing power. The more calculations it can perform per second, the faster it computes and the stronger its computing power.
The computing power of a single computer depends largely on the CPU's clock frequency: the higher the frequency, the stronger the computing power. But the CPU is only one kind of processor; many other chips can compute as well. A graphics card (GPU), for example, also computes, but its power is measured not by clock frequency alone but by parallel processing capability: the more parallel processing units it has, the stronger its computing power.
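To make the "calculations per second" idea above concrete, here is a minimal sketch that counts how many additions an interpreter can perform in a fixed interval. The numbers it prints are only illustrative; pure Python is far slower than the native code a real benchmark would use, and the function name is made up for this example.

```python
import time

# Rough sketch: measure "computing power" as additions per second,
# in the spirit of the 5,000-additions-per-second example above.
def additions_per_second(duration=0.2):
    count = 0
    total = 0
    deadline = time.perf_counter() + duration
    while time.perf_counter() < deadline:
        total += 1          # one addition
        count += 1          # count how many we managed
    return count / duration

rate = additions_per_second()
print(f"about {rate:,.0f} additions per second")
```

On modern hardware even this interpreted loop manages millions of additions per second, which is why the 5,000-per-second figure reads as an early-era example.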
Bitcoin Speeds Up Computing Power
As computer technology has developed, the forms that computing power takes have kept changing and become increasingly diverse.
The earliest form was the single-core CPU, whose key indicator is clock frequency: the higher the frequency, the stronger the computing power.
Then came dual-core, quad-core, 8-core, and other multi-core CPUs. At that point the key indicator also had to include the number of cores, that is, parallel computing capability: at the same clock frequency, more cores mean stronger computing power.
Even with more cores, however, the CPU still processes work in a centralized fashion.
With the development of the Internet, its content gradually changed: text gave way to large volumes of images and video (alongside the growth of the game industry). The CPU is poorly suited to real-time tasks that involve huge concurrency but little computation per operation. In image processing, for example, the work required for each pixel is tiny, yet each image contains millions of pixels. Processed serially on a CPU, a single image takes a long time; processed in parallel, all the pixels can be handled at once and real-time requirements can be met.
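The contrast above can be sketched with NumPy: the same tiny per-pixel operation (a brightness boost, invented for this example) is applied first pixel by pixel, as a single serial core would, and then as one data-parallel array operation. The timings are only illustrative.

```python
import time
import numpy as np

# A made-up 500x500 grayscale "image": one small operation per pixel.
image = np.random.randint(0, 256, size=(500, 500), dtype=np.uint16)

# Serial style: visit each pixel one by one.
start = time.perf_counter()
out_serial = np.empty_like(image)
for i in range(image.shape[0]):
    for j in range(image.shape[1]):
        out_serial[i, j] = min(image[i, j] + 50, 255)  # brightness boost, clamped
serial_time = time.perf_counter() - start

# Data-parallel style: apply the same tiny operation to all pixels at once.
start = time.perf_counter()
out_parallel = np.minimum(image + 50, 255)
parallel_time = time.perf_counter() - start

assert np.array_equal(out_serial, out_parallel)  # identical results
print(f"serial: {serial_time:.3f}s, data-parallel: {parallel_time:.3f}s")
```

The vectorized version is typically orders of magnitude faster, which is exactly the kind of per-pixel parallelism a GPU takes to the extreme.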
Against this background the GPU, commonly known as the graphics card, emerged. The graphics card was born, one could say, dedicated to processing images (and games).
Following the same trajectory, Bitcoin mining progressed from CPU mining to GPU mining, then to dedicated GPU mining, later to purpose-built ASIC mining machines, and finally to mining pools. This development route was driven by increasingly fierce competition among miners. If Bitcoin keeps declining, the route may well reverse until Bitcoin disappears.
It must be acknowledged, however, that competition in Bitcoin mining spurred the development of GPU chips, and that the later emergence of ASIC mining machines spurred the development of ASIC chips, which are considered the core of future AI chips.
After the madness of 2017, the Bitcoin and blockchain bubble gradually began to burst; many Bitcoin mines closed down and many miners withdrew.
Coincidentally, the re-emergence of artificial intelligence coincided with the Bitcoin boom.
Deep learning, the core technology of today's artificial intelligence, requires exactly the kind of massive parallel computing that mining machines demand. But unlike mining, artificial intelligence, having passed through the trough of its S-shaped development curve, has entered an explosive phase; it moves only forward, never backward.
Artificial intelligence took over the baton of computing power and continued to accelerate.
Computing Power at the Front End: Edge Computing
Bitcoin and blockchain contributed enormously to the development of computing power, but perhaps more surprisingly, their distributed nature has once again created an opportunity for the development of artificial intelligence.
In March 2018, the 2018 AI Cloud World Summit, organized by Hikvision, was held at the Hangzhou International Expo Center.
At the summit, Hikvision president Yangzhong Hu said:
“A large number of application requirements are surging in a fragmented way. The implementation of AI is restricted mainly by three factors – data, computing power and application.”
Today and in the future, demand for computing power is arising in a decentralized way.
Later in the summit, Dr. Shiliang Pu, president of Hikvision's Research Institute, pointed out the importance of edge perception and its ability to generate multi-layered cognition that empowers AI applications.
The summit's message was clear: injecting AI power into the edge and empowering edge intelligence is the trend of the future.
The "Edge Computing" entry in the 2019 Technology Trend Report released by EO Intelligence points out:
As 5G and artificial intelligence mature, cloud servers must connect ever more devices and process massive amounts of data. The low-latency, high-traffic computing load makes this centralized data processing model unsustainable. It is therefore necessary to deploy distributed network devices locally, close to the data, to provide processing, storage, and other capabilities, improving the performance and reliability of applications while lightening the load on central servers.
This architecture is called edge computing.
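As a rough illustration of the model described above, here is a minimal sketch (all function and field names are invented for this example) in which an edge node pre-processes raw readings locally and forwards only a compact summary upstream, so the cloud never sees the raw stream:

```python
# Sketch of edge pre-processing: the edge node reduces raw data locally
# and sends only a small summary to the cloud. All names are hypothetical.

def edge_summarize(raw_readings, threshold=0.5):
    """Keep only notable events; aggregate everything else locally."""
    events = [r for r in raw_readings if r > threshold]
    return {
        "count": len(raw_readings),   # how many readings were seen
        "events": events,             # only the interesting ones travel
        "mean": sum(raw_readings) / len(raw_readings),
    }

def cloud_ingest(summaries):
    """The cloud center analyzes summaries, not raw streams."""
    return sum(len(s["events"]) for s in summaries)

# One edge node sees 6 raw readings but uploads a 3-field summary.
summary = edge_summarize([0.1, 0.9, 0.2, 0.7, 0.3, 0.05])
total_events = cloud_ingest([summary])
print(summary["count"], total_events)  # prints "6 2"
```

The point of the sketch is the ratio: six raw readings stay at the edge, and only two events plus an aggregate cross the network, which is how edge computing lightens the central server's load.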
Against this background, it is not surprising that the most important application scenario in CCTV, face recognition, is moving to the edge.
This is precisely the trend toward decentralized computing power, and it echoes the decentralization of Bitcoin.
Computing Power at the Back End: Centralization
If computing power can be placed at the front end, it can also be placed at the back end: this is the centralized data processing model mentioned above.
At present, the mainstream face recognition solution still places computing power at the back end. The front-end face IPC performs only face detection and capture, then passes the face image to a back-end server for comparison and identification (i.e., data processing). As the system expands, this approach puts ever more load on the back-end server. For small and medium-sized face recognition deployments, however, back-end computing power remains the main deployment method.
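The split described above can be sketched as follows. This is a toy illustration, not a real face recognition system: the "embeddings" are hand-made vectors, and the function names and the 0.9 threshold are invented. In practice the front-end IPC would crop a face image and a trained model on the server would produce the embedding before the comparison step shown here.

```python
import math

# Toy sketch of back-end identification: the front-end only captures,
# the back-end server compares. All names and values are hypothetical.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def backend_identify(probe, gallery, threshold=0.9):
    """Compare a probe embedding against enrolled identities on the server."""
    best_name, best_score = None, -1.0
    for name, ref in gallery.items():
        score = cosine_similarity(probe, ref)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

gallery = {"alice": [0.9, 0.1, 0.0], "bob": [0.0, 0.8, 0.6]}
probe = [0.88, 0.12, 0.01]            # captured and forwarded by the front-end IPC
print(backend_identify(probe, gallery))  # prints "alice"
```

Because every camera forwards its probes to this one comparison function, the server's load grows with the number of front-end devices, which is exactly the scaling pressure the paragraph above describes.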
Placing computing power at the back end is not always a bad method. The back-end server usually sits in a machine room, where it can be well protected and is not easily attacked, whereas front-end computing devices are often exposed in the open, making them more vulnerable to attack and data leakage.
Edge Computing Architecture
Hikvision introduced AI Cloud in October 2017 at the China Public Security Expo in Shenzhen. AI Cloud is a development concept born out of the Internet-of-Things (IoT) era.
Within AI Cloud, edge computing is included and described as follows:
It is a three-layer architecture combining cloud and edge computing. Edge Nodes provide multi-dimensional perception and front-end processing; their data is processed in real time and converged to Edge Domains, which run intelligent applications and create new data; finally, data is converged on demand to the Cloud Center for big-data analysis.
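The three-layer flow just described can be sketched as a toy pipeline. Everything here is invented for illustration (the frame format, the "objects" heuristic, the function names); the point is only the direction of data flow: Edge Node, then Edge Domain, then Cloud Center, with less data at each step.

```python
# Toy sketch of the three-layer architecture: Edge Node -> Edge Domain
# -> Cloud Center. All names and data shapes are hypothetical.

def edge_node(frame):
    """Perception and front-end processing: extract features from a raw frame."""
    return {"camera": frame["camera"], "objects": len(frame["pixels"]) // 4}

def edge_domain(node_outputs):
    """Real-time convergence and intelligent application: aggregate one site."""
    return {"site_total": sum(n["objects"] for n in node_outputs)}

def cloud_center(domain_reports):
    """On-demand convergence for big-data analysis across many sites."""
    return sum(r["site_total"] for r in domain_reports)

frames = [{"camera": i, "pixels": [0] * 8} for i in range(3)]
report = edge_domain([edge_node(f) for f in frames])
print(cloud_center([report]))  # 3 nodes x 2 "objects" each -> prints 6
```

Raw pixels never leave the Edge Node layer; the Edge Domain sees per-node features, and the Cloud Center sees only per-site totals.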
Edge Computing Industry
With the development of cloud computing and computing power, edge computing has become a future direction of computing technology. Many companies focused on edge computing have appeared, along with many representative edge computing products.
Internationally, the main players include giants such as Amazon, Microsoft, and Google, with Intel, Cisco, Hewlett-Packard, IBM, and others following.
In China, representative companies include BAT (Baidu, Alibaba, Tencent), Huawei, China's mobile network operators, and other network technology companies.
Edge computing, however, has not yet converged on unified standards and protocols, so major companies are taking the lead in forming alliances to promote standardization.
In 2016, Huawei, the Shenyang Institute of Automation of the Chinese Academy of Sciences, Intel, and others jointly launched the Edge Computing Industry Alliance to promote research collaboration and application incubation and to build an industrial ecosystem.
The Future of Edge Computing
As chip computing power grows ever stronger, sensing devices at the edge of the system gain more powerful computing and storage capabilities. Pushing computing power to the edge is therefore certain to be the future trend, and its range of uses will keep widening: artificial intelligence, the Internet of Things, autonomous driving, and more will increasingly rely on edge computing. This will be a huge industry with many business opportunities, and edge computing must not be ignored.
We will be surrounded by computing; in the future, computing will be everywhere.