The Integration of AI and Blockchain: A Comprehensive Analysis from Industry Chain to Tokenomics

The Integration of AI and Blockchain: From Zero to Peak

The recent rapid development of the artificial intelligence industry is widely regarded as a fourth industrial revolution. The emergence of large language models has significantly improved efficiency across industries, with some estimates putting the resulting gain in overall work efficiency in the United States at around 20%. At the same time, the generalization ability of large models is seen as a new software design paradigm: where software used to be built from precisely specified code, it now increasingly embeds generalized large-model frameworks that can handle a much wider range of input and output modalities. Deep learning has indeed ushered in a new wave of prosperity for the AI industry, and that wave has also spread to the cryptocurrency industry.

This article will explore in detail the development history of the AI industry, the classification of technologies, and the profound impact of deep learning on the industry. We will analyze the upstream and downstream of the deep learning industry chain, including GPUs, cloud computing, data sources, edge devices, etc., and examine their current development status and trends. In addition, we will fundamentally discuss the relationship between cryptocurrency and the AI industry, outlining the landscape of the AI industry chain related to cryptocurrency.


The Development History of the AI Industry

The AI industry began in the 1950s. To realize the vision of artificial intelligence, academia and industry have developed various implementation paths under different historical contexts.

Modern artificial intelligence is mainly built on "machine learning", whose core idea is to let machines improve system performance by iterating over data. The main steps are feeding data into an algorithm, training a model on it, testing and deploying the model, and finally using it for automated prediction tasks.
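
As a minimal illustration of this workflow, here is a sketch in Python; the use of scikit-learn and its bundled iris dataset is an assumption made purely for the example and is not mentioned in the original.

```python
# A minimal sketch of the input -> train -> test -> predict loop,
# using scikit-learn's bundled iris dataset purely for illustration.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)                        # 1. input data
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000)                 # 2. train the model
model.fit(X_train, y_train)

print("test accuracy:", model.score(X_test, y_test))      # 3. test / evaluate
print("prediction   :", model.predict(X_test[:1]))        # 4. automated prediction
```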

Machine learning currently has three major schools: connectionism, symbolism, and behaviorism, which imitate the human nervous system, human reasoning, and human behavior, respectively. Connectionism, represented by neural networks, currently dominates; its modern form is known as deep learning. A neural network consists of an input layer, an output layer, and multiple hidden layers; with enough layers and neurons, it can fit complex, general-purpose tasks.
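
To make the input/hidden/output structure concrete, here is a toy forward pass in plain NumPy; the layer sizes and random weights are illustrative assumptions, not values from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

# Illustrative sizes: 4 input features, two hidden layers of 8 neurons, 3 outputs.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 8)), np.zeros(8)
W3, b3 = rng.normal(size=(8, 3)), np.zeros(3)

def forward(x):
    h1 = relu(x @ W1 + b1)   # hidden layer 1
    h2 = relu(h1 @ W2 + b2)  # hidden layer 2
    return h2 @ W3 + b3      # output layer (raw scores)

print(forward(rng.normal(size=(1, 4))).shape)  # -> (1, 3)
```

In practice the weights are not random but learned from data, which is exactly what the training pipeline described above is for.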

Deep learning based on neural networks has itself gone through multiple iterations, from the earliest neural networks to feedforward networks, RNNs, CNNs, and GANs, finally evolving into the modern large models built on the Transformer architecture used by GPT and others. The Transformer is one evolutionary direction of neural networks: it adds an encoding step that converts data from different modalities (such as audio, video, and images) into numerical representations, which are then fed into the neural network, enabling multimodal processing.
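
That "conversion into numerical representations" can be pictured as a token-and-embedding lookup. The sketch below uses a hypothetical four-word vocabulary and a random embedding table purely for illustration; real models use learned tokenizers and analogous encoders for audio and image patches.

```python
import numpy as np

# Hypothetical toy vocabulary; real tokenizers have tens of thousands of entries.
vocab = {"<unk>": 0, "ai": 1, "meets": 2, "crypto": 3}
d_model = 8                                   # embedding width (illustrative)
embedding = np.random.default_rng(0).normal(size=(len(vocab), d_model))

def encode(text: str) -> np.ndarray:
    """Map text to one numerical vector per token."""
    token_ids = [vocab.get(w, vocab["<unk>"]) for w in text.lower().split()]
    return embedding[token_ids]

vectors = encode("AI meets crypto")
print(vectors.shape)                          # (3, 8): three tokens, 8 numbers each
```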


The development of AI has undergone three waves of technological advancement:

  1. 1960s: Symbolic approaches tackled general natural language processing and human-computer dialogue. Expert systems were born during this period.

  2. 1990s: The proposal of Bayesian networks and behavior-based robotics marked the birth of behaviorism. In 1997, IBM's Deep Blue defeated the world chess champion, which was seen as a milestone for AI.

  3. From 2006 to present: The concept of deep learning was proposed, and algorithms based on artificial neural networks have gradually evolved, from RNN, GAN to Transformer and Stable Diffusion, marking the heyday of connectionism.

Some landmark events in the AI field in recent years include:

  • In 2015, a landmark deep learning paper was published in the journal Nature, drawing a huge response from academia and industry.
  • In 2016, AlphaGo defeated Go world champion Lee Sedol.
  • In 2017, Google released the Transformer algorithm paper, and large-scale language models began to emerge.
  • From 2018 to 2020, OpenAI released the GPT series models, with an increasing number of parameters.
  • In November 2022, OpenAI launched ChatGPT (initially based on GPT-3.5); by January 2023 it had reached 100 million users, making it the fastest application in history to do so. GPT-4 was released in March 2023.


Deep Learning Industry Chain

Current large language models mainly rely on deep learning methods based on neural networks. Large models represented by GPT have triggered a new wave of AI enthusiasm, and a large number of players have poured into this space. Market demand for data and computing power is growing rapidly, so this section examines the composition of the deep learning industry chain, as well as the current state, supply-demand dynamics, and future development of its upstream and downstream segments.

The training of large language models (LLMs) such as GPT is mainly divided into three steps (a purely schematic sketch follows the list):

  1. Pre-training: Input a large amount of data to find the best parameters for the neurons; this process is the most computationally intensive.

  2. Fine-tuning: Use a small amount of high-quality data for training to improve the model's output quality.

  3. Reinforcement Learning: Establish a "reward model" to rank output results, used for iterating large model parameters.
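
The following outline is purely schematic: every function is a stub standing in for large-scale infrastructure, and the names are hypothetical, chosen only to show how the three stages chain together.

```python
# Schematic only: stubs standing in for real data pipelines and training code.
def pretrain(model, web_scale_corpus):
    # Stage 1: fit parameters on massive unlabeled data (the compute-heavy step).
    for batch in web_scale_corpus:
        model.update(batch)
    return model

def finetune(model, curated_examples):
    # Stage 2: continue training on a small amount of high-quality data.
    for example in curated_examples:
        model.update(example)
    return model

def reinforce(model, reward_model, prompts):
    # Stage 3: score the model's own outputs with a reward model and push the
    # parameters toward higher-scoring responses (RLHF-style iteration).
    for prompt in prompts:
        candidates = model.generate(prompt, n=4)
        scores = [reward_model.score(prompt, c) for c in candidates]
        model.reinforce(candidates, scores)
    return model
```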

The three key factors affecting the performance of large models are the number of parameters, the amount and quality of data, and computing power. If the number of parameters is p and the amount of data is n (measured in tokens), the required computing power can be estimated with a simple rule of thumb.

Computing power is generally measured in FLOPs, where one FLOP is a single floating-point operation. By the rule of thumb, pre-training a large model requires about 6np FLOPs, while inference (feeding data in and waiting for the model's output) requires about 2np FLOPs.
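
To make the rule of thumb concrete, here is a back-of-the-envelope sketch; the model size, token count, cluster size, and per-GPU throughput are illustrative assumptions rather than figures from the article.

```python
# Back-of-the-envelope estimate using the ~6np training rule of thumb.
# All concrete numbers are illustrative assumptions.
params = 10e9          # p: 10 billion parameters (hypothetical model)
tokens = 1e12          # n: 1 trillion training tokens (hypothetical corpus)
gpu_tflops = 500       # assumed sustained throughput per GPU, in TFLOPS
num_gpus = 256         # assumed cluster size
utilization = 0.4      # real clusters rarely reach peak throughput

train_flops = 6 * params * tokens        # ~6np for one pre-training run
infer_flops = 2 * params * tokens        # ~2np to run inference over n tokens

cluster_flops_per_s = gpu_tflops * 1e12 * num_gpus * utilization
days = train_flops / cluster_flops_per_s / 86_400

print(f"training compute : {train_flops:.2e} FLOPs")
print(f"inference compute: {infer_flops:.2e} FLOPs")
print(f"wall-clock time  : {days:.1f} days on the assumed cluster")
```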

In the early days, training was mainly performed on CPUs, but it gradually shifted to GPUs such as NVIDIA's A100 and H100. GPUs perform floating-point calculations through Tensor Core modules, and their FLOPS ratings at FP16/FP32 precision are an important indicator of a chip's computing power.

Taking GPT-3 as an example: it has 175 billion parameters and a training data volume of 180 billion tokens. One pre-training run requires about 3.15×10^22 FLOPs, or 3.15×10^10 TFLOPs. Pre-training GPT-3 once on a single NVIDIA H100 SXM chip would take about 584 days.

It can be seen that training large models requires enormous computational power, necessitating the collaboration of multiple advanced chips. The parameter count and data volume of GPT-4 are ten times that of GPT-3, and it may require more than 100 times the chip computing power.

Data storage is also a challenge in large model training. GPT-3's training data occupies about 570GB and its parameters about 700GB. GPU memory is comparatively small (an A100, for example, has 80GB) and cannot hold all of this at once, so chip memory bandwidth must be considered. When training across multiple GPUs, the data transfer rate between chips also comes into play. Often the bottleneck limiting training speed is not computing power but data transfer speed.
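
A rough parameter-memory calculation shows why a single 80GB card is not enough; note that this counts weights only, ignoring optimizer states, gradients, and activations, which multiply the real requirement.

```python
import math

# Weight memory for a 175B-parameter model at different numeric precisions.
# Weights only: optimizer states, gradients, and activations add several times more.
params = 175e9
bytes_per_param = {"FP32": 4, "FP16/BF16": 2, "INT8": 1}
gpu_memory_gb = 80  # e.g. a single A100 80GB

for precision, nbytes in bytes_per_param.items():
    gb = params * nbytes / 1e9
    cards = math.ceil(gb / gpu_memory_gb)
    print(f"{precision:>9}: {gb:5.0f} GB of weights -> at least {cards} x 80GB GPUs")
```

At FP32 this matches the roughly 700GB of parameter storage mentioned above.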

The deep learning industry chain mainly includes the following links:

1. Hardware GPU Providers

NVIDIA holds an absolute leading position in AI GPU chips. Academia mainly uses consumer-grade GPUs such as the RTX series, while industry mainly uses commercial chips such as the H100 and A100. Google also has its own TPU chips, but they are primarily used for Google Cloud services.

As soon as the NVIDIA H100 chip was released in 2023, it received a large number of orders and fell into short supply; by the end of 2023, H100 order volume exceeded 500,000 units. To reduce dependence on NVIDIA, Google has taken the lead in forming an alliance with other technology companies to jointly develop GPUs and break away from the lock-in of the CUDA ecosystem.


2. Cloud Service Providers

Cloud service providers purchase large numbers of GPUs to build high-performance computing clusters, offering flexible computing power and managed training services to AI companies with limited funding. They mainly fall into three categories:

  • Traditional large cloud providers: AWS, Google Cloud, Azure, etc.
  • Vertical AI Cloud Computing Platforms: CoreWeave, Lambda, etc.
  • Inference as a Service providers: Together.ai, Fireworks.ai, etc.

3. Data Source Providers

Training large models requires massive amounts of data. Some companies specialize in providing training data for various industries, such as professional datasets in finance, healthcare, chemistry, and other fields.

4. Database Providers

AI training requires efficient storage and processing of massive amounts of unstructured data, which has led to the emergence of specialized "vector databases". Major players include Chroma, Zilliz, Pinecone, etc.
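
At its core, a vector database stores embeddings and answers nearest-neighbour queries. The toy version below uses brute-force cosine similarity in NumPy, with random vectors standing in for real embeddings; production systems such as those named above rely on approximate-nearest-neighbour indexes to do this at scale.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 384                                 # embedding width (illustrative)
store = rng.normal(size=(10_000, dim))    # stand-ins for document embeddings

def top_k(query: np.ndarray, k: int = 5) -> np.ndarray:
    # Brute-force cosine similarity; vector databases replace this scan with
    # ANN index structures (e.g. HNSW, IVF) to search billions of vectors.
    sims = store @ query / (np.linalg.norm(store, axis=1) * np.linalg.norm(query))
    return np.argsort(-sims)[:k]          # indices of the k most similar vectors

print(top_k(rng.normal(size=dim)))
```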

5. Edge Devices

GPU clusters generate a large amount of heat and require cooling systems to ensure stable operation. Currently, air cooling is mainly used, but liquid cooling systems are gaining favor from investors. In terms of energy supply, some technology companies have started investing in clean energy sources such as geothermal, hydrogen, and nuclear energy.

6. AI Applications

The current development of AI applications resembles that of the blockchain industry: infrastructure is crowded, while application development lags behind. Most of the top ten AI applications by monthly active users are search-related products, with relatively few in social and other categories, and the user retention of AI applications is generally lower than that of traditional internet applications.

Overall, the deep learning industry chain is developing rapidly, but it also faces many challenges. The demand for computing power continues to grow, data and energy consumption are enormous, and application scenarios need further expansion. In the future, each link in the industry chain will continue to optimize and upgrade to support larger scale and more efficient AI model training and application.


The Relationship Between Cryptocurrency and AI

The core of blockchain technology is decentralization and trustlessness. From Bitcoin as a peer-to-peer electronic cash system to Ethereum's smart contract platform, a blockchain is essentially a value network in which each transaction is a value exchange based on the underlying token.

On the traditional internet, value is converted into stock prices and market capitalization through indicators such as the P/E ratio. In a blockchain network, the native token embodies multidimensional value: it can earn staking rewards, serve as a medium of value exchange and a store of value, and act as the consumable for network activity.

The importance of token economics lies in its ability to assign value to any function or idea within the network. Tokens enable value reconstruction across various segments of the AI industrial chain, motivating more people to delve into specific AI niches. At the same time, the synergistic effect of tokens enhances the value of infrastructure, creating a pattern of "fat protocols and thin applications."

The immutability and trustless characteristics of blockchain technology can also bring real value to the AI industry:

  • Enabling model training and inference while preserving data privacy
  • Distributing and utilizing idle GPU computing power through a global network
  • Providing a reliable value discovery and exchange mechanism for every link of the AI industry chain

In summary, token economics can promote the reshaping and discovery of value in the AI industry, while decentralized ledgers can solve trust issues and re-enable the flow of value globally. This combination will bring new driving forces and opportunities to the AI industry.


Overview of AI Industry Chain Projects in the Cryptocurrency Sector

GPU Supply Side

The main blockchain GPU cloud computing projects currently include Render, Golem, and others. Render, as a relatively mature project, is primarily aimed at traditional tasks such as video rendering and does not strictly fall under the AI category. However, a GPU cloud market can serve not only AI model training and inference but also traditional rendering, reducing the risk of depending on a single market.

According to industry forecasts, the GPU computing power market will be worth approximately $75 billion in 2024, growing to $773 billion by 2032, a compound annual growth rate of 33.86%. As GPU generations iterate more quickly, large volumes of older, idle GPUs will accumulate, which should significantly increase demand for shared GPU computing power.
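
As a quick sanity check on the quoted growth rate (taking the article's two endpoint figures at face value), the implied CAGR can be recomputed directly:

```python
# Implied CAGR from the quoted 2024 and 2032 market-size figures.
start, end, years = 75e9, 773e9, 2032 - 2024
cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.2%}")   # about 33.9%, consistent with the quoted 33.86%
```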


Hardware Bandwidth

Bandwidth is often a key factor affecting cloud computing performance, especially for decentralized GPU sharing networks. Some projects like Meson Network attempt to address this issue by sharing bandwidth, but the actual effect is limited because latency caused by geographical location is still difficult to avoid.

Data

AI data providers include EpiK Protocol, Synesis One, Masa, and others. Compared to traditional Web2 data companies, blockchain projects have advantages in data collection and can provide incentives for personal data contributions. Combined with privacy computing technologies such as zero-knowledge proofs, broader data sharing is expected to be achieved.

ZKML

To achieve model training and inference under data privacy protection, some projects adopt zero-knowledge proof schemes. Typical projects include Axiom, Risc Zero, etc., which can provide ZK proofs for off-chain computation and data. The application boundaries of these general ZK projects are broader, making them more attractive to investors.

