Data Tokenization
Description
Louise Borreani and Pat Rawson:
"Data tokenization is the process of converting data of any sort into tokens that can be securely transferred without revealing the original data. This process enhances “data security, privacy, and compliance while preventing unauthorized access and misuse.” Datatokens are a new Web3-native asset class, finding no equivalent in the traditional world where the technical access conditions of intellectual property (IP) are embedded into its asset form. All forms of IP, from “patents, datasets, or contractual agreements” are being wrapped inside tokens like IP-NFTs, “enabling easy transfer and collective ownership over such assets.”[47] IP-NFTs are an increasingly popular token model implemented in the decentralized science community, having already proven their usefulness as investment instruments in the biotechnology and longevity sectors."
(https://mirror.xyz/ecofrontiers.eth/zkh2LoADInAgr7GLbXnsuUOEcwJKFE4GuUSYuYU22io)
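The core idea in the first sentence of the quote, replacing sensitive data with tokens that can be passed around without exposing the original, can be illustrated with a minimal vault-style sketch. This is a generic data-security illustration, not the on-chain datatoken or IP-NFT mechanism Borreani and Rawson describe; the `TokenVault` class and its method names are assumptions made for the example.

```python
import secrets

class TokenVault:
    """Minimal vault-style tokenizer: each sensitive value is swapped for a
    random token, and only the vault can map a token back to the original."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse the existing token if this value was tokenized before.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # A random token reveals nothing about the original value.
        token = "tok_" + secrets.token_hex(16)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only a holder of the vault can reverse the mapping.
        return self._token_to_value[token]

vault = TokenVault()
t = vault.tokenize("4111-1111-1111-1111")  # e.g. a card number
print(t != "4111-1111-1111-1111")           # the token carries no original data
print(vault.detokenize(t) == "4111-1111-1111-1111")
```

The token itself can be stored or transferred freely; anyone without access to the vault learns nothing about the underlying value, which is the "security, privacy, and compliance" property the quote refers to.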