
The era of data transaction 2.0 is coming, privacy computing makes data “available and invisible”



Author: Du Chuan

  [ Industry experts believe that data transactions in practice still face multiple constraints, including privacy protection and security. To ensure that data sources can be confirmed, the scope of use can be defined, the circulation process can be traced, and security risks can be prevented, cutting-edge technologies such as privacy computing offer a feasible path. ]

As a new factor of production, data is regarded as the “oil” of the digital economy era. Driven by policies related to data elements, data transactions are entering the 2.0 stage, and segments such as data collection, storage, cleaning, analysis, management, and transmission are set to develop further.

Recently, Shenzhen Data Exchange Co., Ltd. (hereinafter “Shenzhen Data Exchange”), together with 50 co-sponsors including national think tanks, state institutions, universities, large financial institutions, and large Internet companies, led the establishment of the “Open Islands” open source community, China’s first international, independently controllable open source community for privacy computing. The community integrates resources from government, enterprises, universities, and research institutions in an open-source, open manner, and promotes the development of key foundational technologies for the circulation of data elements.

Industry experts believe that data transactions in practice still face constraints such as privacy protection and security. To ensure that data sources can be confirmed, the scope of use can be defined, the circulation process can be traced, and security risks can be prevented, cutting-edge technologies such as privacy computing provide a feasible path.

  The era of data transaction 2.0 is coming

The introduction of the top-level design will undoubtedly promote the legal and compliant circulation of data elements, and stimulate the vitality of the data element market.

At the beginning of this year, the State Council issued the “Overall Plan for the Comprehensive Reform of the Market-based Allocation of Factors”, which proposed exploring the transaction paradigm of “original data does not leave its domain, and data is available but invisible” and, on the premise of protecting personal privacy and ensuring data security, promoting the application of data circulation in selected fields in a classified, graded, step-by-step and orderly manner. The recently issued “Opinions of the CPC Central Committee and the State Council on Accelerating the Construction of a Unified National Market” clearly put forward the requirement to “accelerate the cultivation of a unified technology and data market”, and the “Opinions on Building a More Complete System and Mechanism for the Market-Based Allocation of Factors” pointed out that the efficiency of factor allocation should be improved by promoting the autonomous and orderly flow of factors.


Wang Teng, deputy general manager of Shenzhen Data Exchange, believes that building a unified data market can further improve the top-level structure and system design of data transactions, promote the establishment of unified rules and standards, interconnected and shared trading systems, and standardized operating mechanisms for data trading venues across the country, and drive the cross-industry and cross-regional flow and innovative application of data resources.

Driven by policies related to data elements, a wave of data trading centers has been established across the country. As early as 2015, Guiyang established the country’s first big data trading center, which is regarded as the beginning of the data trading 1.0 era; since 2021, Beijing, Shanghai, Shenzhen, Guangzhou and other cities have planned or begun to build data exchanges. Among them, the Beijing International Big Data Exchange and the Shanghai Data Exchange have been officially launched, while the Shenzhen Data Exchange and the Guangzhou Data Exchange are under preparation, which is regarded as marking the advent of the data trading 2.0 era.

According to a report by Zero One Think Tank, 16 data trading platforms have been established since 2020; as of March 2022, a total of 39 data exchanges had been initiated, led or approved by local governments across the country (excluding Hong Kong, Macao and Taiwan).

Wang Teng believes that industrial policies, laws and regulations for the data element market have gradually improved, digital application scenarios have become richer, the technology has matured, and the form of data transactions has also changed. A data transaction is no longer a simple transfer of data ownership, but a transaction based on data usage rights, oriented toward complex application scenarios such as multi-dimensional data fusion, on the premise of protecting the rights and security of all data parties.


Privacy computing provides “technical solutions”

It is worth noting that in the process of promoting the cultivation of the data element market, the establishment of a data exchange is only the first step. As a new production factor, data also faces many constraints such as privacy protection and security compliance in transaction practice.

“Data transactions must first solve the problem of data sources.” Wang Teng said.

In 2021, the state successively promulgated the Personal Information Protection Law and the Data Security Law, and the top-level design of the data security legal system has gradually improved. However, the legal boundaries enterprises face when lawfully developing and utilizing data resources and participating in data transactions are still not clear.

Wang Teng believes that how to open up transaction data in compliance with laws and regulations, how to protect the rights and interests in enterprises’ own data, and how to meet the requirements of multiple regulators are the main problems currently troubling enterprises. In the absence of specific normative guidelines and effective security assessments, companies are reluctant to open up data resources out of risk-control considerations, nor do they dare to develop and utilize data resources of unknown origin. An invisible barrier thus stands between strong data demand and a relatively insufficient data supply.

Cutting-edge technologies such as privacy computing provide a feasible path for striking a balance between data protection and the rational use of data value, breaking the barriers between data supply and demand, protecting privacy, security and the rights and interests of data subjects, and promoting data fusion applications.

So-called privacy computing refers to a collection of technologies that enable data analysis and computation without the data itself being exposed, so that data becomes “available but invisible” and its value can be transformed and released on the premise of fully protecting data and privacy security.
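For intuition, here is a minimal, purely illustrative Python sketch of the “available but invisible” idea, using additive secret sharing to let several parties jointly compute an aggregate over their confidential values without any party revealing its raw number. The function names and the three-party setup are hypothetical simplifications, not the design of any specific privacy-computing platform; production systems layer full multi-party computation protocols, trusted execution environments, authentication and audit logging on top of this basic idea.

```python
# Minimal sketch of "available but invisible": parties jointly compute the sum
# of their private values with additive secret sharing, so no party (and no
# aggregator) ever sees another party's raw number. Illustrative only.
import random

MODULUS = 2**61 - 1  # arithmetic is done modulo a large prime


def share(secret: int, n_parties: int) -> list[int]:
    """Split a secret into n random shares that sum to the secret mod MODULUS."""
    shares = [random.randrange(MODULUS) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % MODULUS)
    return shares


def joint_sum(private_values: list[int]) -> int:
    """Each party shares its value; parties add shares locally, then recombine."""
    n = len(private_values)
    all_shares = [share(v, n) for v in private_values]
    # Party i only ever holds one share from each other party, never raw values.
    partial_sums = [sum(all_shares[p][i] for p in range(n)) % MODULUS for i in range(n)]
    return sum(partial_sums) % MODULUS


if __name__ == "__main__":
    revenues = [120, 340, 75]  # each party's confidential figure
    print("joint sum:", joint_sum(revenues))  # 535, with no raw value revealed
```

Each individual share looks like random noise, so a curious party learns nothing about another party’s input; only the combined result is ever reconstructed, which is exactly the sense in which the data stays invisible while remaining usable.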

Yang Qiang, academician of the Canadian Academy of Engineering and the Royal Society of Canada, chairman of the FATE Open Source Community Technical Steering Committee, and chairman of the Open Islands Community Technical Steering Committee, told Yicai that data is the most basic production factor in the digital economy. Technologies such as privacy computing and federated learning can make data available but invisible: the data itself is used only under authorization and cannot be copied a second time, while the knowledge contained in the data can be shared with others on the premise of security protection, auditability, and traceability.
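As a rough illustration of the federated learning pattern Yang Qiang describes, the hedged Python sketch below has each data holder fit a small model locally and share only its model weights with a coordinator, which averages them; the raw records never leave their owners. The helper names and the toy linear-regression setup are illustrative assumptions, not the API of FATE or any particular platform, and real deployments add secure aggregation, differential privacy and audit trails on top.

```python
# Sketch of federated averaging: only model weights travel, never raw data.
import numpy as np


def local_train(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Fit y ~ X locally via least squares; only the weights leave this party."""
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w


def federated_average(weight_list: list[np.ndarray], sizes: list[int]) -> np.ndarray:
    """Coordinator combines local weights, weighted by each party's sample count."""
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(weight_list, sizes))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    parties = []
    for n in (100, 150, 80):          # three data holders with different sizes
        X = rng.normal(size=(n, 2))
        y = X @ true_w + rng.normal(scale=0.1, size=n)
        parties.append((X, y))

    local_weights = [local_train(X, y) for X, y in parties]
    global_w = federated_average(local_weights, [len(y) for _, y in parties])
    print("federated estimate of weights:", global_w)  # close to [2, -1]
```

The design choice that matters here is the direction of flow: knowledge extracted from the data (the weights) is shared and can be logged and audited, while the underlying records stay with their owners, which is the sense in which the data is authorized but not copied.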


Wang Teng said that through privacy computing technology, model calculation and result output can be realized without contacting the original data, realizing “the source of data can be confirmed, the scope of use can be defined, the circulation process can be traced, and security risks can be prevented”.

Yang Qiang said: “We need a technology that is inclusive, one that everyone can use and learn. We have proposed the concept of ‘trusted federated learning’, a highly usable technology that we hope to carry from its initial ‘theoretical completeness’ to today’s ‘practical completeness’. On this premise, an open source ecosystem makes such technologies and platforms available to everyone, helps break data and technology silos, concentrates the strength of society on technological innovation, and provides inclusive market education on the technology, which will help us cultivate more digital and data talents.”

It is reported that the Open Islands open source community relies on data transaction scenarios and open-source, open methods of production and collaboration to break through the silos between data, technology, platforms, and institutions, achieving cross-domain, cross-regional, and cross-platform interconnection and interoperability, with the goal of serving national application scenarios for data element circulation and helping to accelerate the construction of a unified national market for data transactions.

