The more data artificial intelligence has, the more accurate its outcomes and predictions become. Yet because of the foundations of internet architecture and the underlying nature of the Internet, we limit the type of data algorithms have access to and how they access it.
The centralised approach of the Internet limits how we transfer, authorise and access data across networks. It is very difficult to replicate terabytes or petabytes of data across internet networks in order to access and process it.
The decentralised interconnectedness of blockchains provides a new way to connect data without the overheads of trust, security and controls. Blockchains provide human-to-human or machine-to-machine trust without any of the parties needing to know or trust each other.
The consequences of existing internet architecture are fundamentally going to change. For the first time, we can honour the original vision of the internet, an open, trusted network in which people, machines and data can operate, but built on blockchain technology and without the original flaw of every peer having to know every other. For the first time, we can trust the network without the need to know and trust each other.
The original vision of ARPANET was a closed network where every member (node) knew exactly who the other party was. With the immense expansion of the internet, more and more networks and people connected, and naturally everyone no longer knew everyone else. The internet became, and stayed, untrusted.
We had to build firewalls, harden operating systems and create internal networks sitting behind external-facing gateways just to protect internal users and data. We had to build controls on top of controls to detect, protect against and respond to threats.
The two technologies at the core of the next human advancement have taken their own paths and evolution to reach this next stage. I foresee that evolution now converging.
Up until now we have been advancing and evolving our current technology in silos. The advancement of machine learning and AI has stemmed from the Bayesian, Symbolist, Connectionist and Evolutionary tribes each working almost in isolation to advance their particular areas of work.
Artificial intelligence was born out of the need to win a war. Computing, in principle, was not developed to break codes; early computing was simple, and calculations were made to solve mathematical and computational challenges.
In trying times, such as war, motivations and incentives became a catalyst for an evolutionary step, much as in nature when animals became amphibious.
We foresaw the thinking computer. The Turing test was developed to validate progress towards the ultimate goal: people being unable to tell the difference between man and machine.
That has been the holy grail computer scientists have sought ever since it was envisaged all those years ago, and we’ve made progress.
Today we have developed mechanisms and machines capable of accomplishing complex calculations that were previously impossible to achieve.
Advancement in the field of machine learning stemmed from rapid progress in statistics, data science and analysis, fuelled by the flood of big data being collected on everything.
We have collected zettabytes of data by observing the interactions of the natural and logical worlds. Data is everywhere, and we have only just woken up to how to collect it and make use of it.
Think of all the intricate interactions in nature, every single movement, motion, action, reaction creates a data point that we’ve only just started to learn to catalogue and collect.
Topped with advancements in data analysis methods, we have created new ways to analyse vast amounts of data simply and accurately. Machine learning has only recently exploded, through an evolution of pre-existing methods.
We have identified ways in which data can be analysed using the evolution of Bayesian statistics, fuzzy logic and similar techniques, developed into newer approaches such as neural networks (including recurrent and convolutional variants), regression, clustering and segmentation.
This has led to new ways of doing analysis, whether supervised, semi-supervised or unsupervised, in the fields of machine learning and deep learning.
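As a concrete illustration of the supervised case, here is a minimal sketch, using only the Python standard library, of the simplest possible learner: fitting a straight line to labelled examples by ordinary least squares, then predicting an unseen input. The data and function names are my own, chosen purely for illustration.

```python
# Minimal supervised-learning sketch: learn y = a*x + b from labelled
# examples by ordinary least squares, then predict an unseen input.

def fit_linear(xs, ys):
    """Return slope a and intercept b minimising squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

def predict(a, b, x):
    """Apply the learned parameters to a new, unlabelled input."""
    return a * x + b

if __name__ == "__main__":
    # Labelled training data generated from y = 2x + 1.
    xs = [1.0, 2.0, 3.0, 4.0]
    ys = [3.0, 5.0, 7.0, 9.0]
    a, b = fit_linear(xs, ys)
    print(a, b)              # learned parameters: 2.0, 1.0
    print(predict(a, b, 10)) # prediction for an unseen input: 21.0
```

Modern deep learning replaces the straight line with millions of parameters and the closed-form solution with gradient descent, but the shape of the task is the same: labelled data in, learned parameters out, predictions on new inputs.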
Our technology and approach have resulted in new capabilities such as computer vision, autonomous driving, prediction and even learning to play games such as Go. We’re just at the beginning of what we can accomplish.
The technology behind blockchain stemmed from cryptography and security. The mechanisms of public key infrastructure have long been used in security for authentication and authorisation.
Blockchain gives us the ability to map and understand data, and to transfer data, without the need to trust or know the other parties.
The technology provides immutable proof that transactions or data stored in a ledger are valid and can be trusted as truth between parties. This validation is done through mass computing power, supplied by miners, used to verify that the cryptography within each block is secure and that its integrity links to previous blocks, creating a chain. A consensus algorithm is then used to accept the validity of the data.
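The chaining idea described above can be shown in a few lines. This is a deliberately minimal sketch of the integrity chain only: each block's hash covers its data plus the previous block's hash, so tampering with any earlier block invalidates every hash after it. Real networks add proof-of-work or other consensus on top; none of that is modelled here.

```python
# Minimal sketch of a hash chain: each block's hash depends on the
# previous block's hash, so any tampering breaks all later links.
import hashlib

GENESIS_PREV = "0" * 64  # placeholder "previous hash" for the first block

def block_hash(data: str, prev_hash: str) -> str:
    """Hash a block's data together with the previous block's hash."""
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def build_chain(entries):
    """Build a chain of blocks from a list of data entries."""
    chain, prev = [], GENESIS_PREV
    for data in entries:
        h = block_hash(data, prev)
        chain.append({"data": data, "prev_hash": prev, "hash": h})
        prev = h
    return chain

def verify_chain(chain) -> bool:
    """Recompute every hash and check each link back to genesis."""
    prev = GENESIS_PREV
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        if block["hash"] != block_hash(block["data"], prev):
            return False
        prev = block["hash"]
    return True

if __name__ == "__main__":
    chain = build_chain(["alice->bob:5", "bob->carol:2"])
    print(verify_chain(chain))          # True: chain is intact
    chain[0]["data"] = "alice->bob:50"  # tamper with an earlier block
    print(verify_chain(chain))          # False: every later link breaks
```

Consensus algorithms exist precisely because, in a network with no trusted central party, everyone must independently run a check like `verify_chain` and agree on which valid chain is the accepted one.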
For AI to be effective, it needs a constant stream of data to process to improve its algorithms, accuracy and outputs.
For the first time, we are opening up and providing unrestricted access to huge datasets for processing without the limitations and restrictions of internet architectures.
We can create new AI agents that are no longer restricted to closed internal networks or clouds but able to openly and autonomously traverse blockchain networks to identify and access data.
We’ve seen a few early attempts at this in the DAO and in Ethereum smart contracts, which attempted to run autonomous code backed with cryptocurrency to perform tasks and potentially accumulate further currency. They failed not only because of the immaturity of their application and code, but because of the limitations of their environment. The code was not truly free in the full sense of the word; it was bound and restricted to its surrounding environment.
But the true Turing test of Autonomous AI on Blockchain is the ability for the agent to live entirely autonomously on a decentralised internet.
In order to achieve this fundamental change in our architecture, we have to tackle the challenge one piece at a time. We can’t make a complete change to the internet overnight without disrupting the nature of the networks.
Our unique approach has been to build an overlay on top of existing and future blockchains, which themselves sit on traditional internet foundations. We need a virtual decentralised internet that works on top of the existing architecture and provides a new way to connect and interact.
We have the same issues in the blockchain space as we did with the early internet networks: independent, closed networks that don’t connect to each other. It was only when internet networks connected, thanks to the adoption of TCP/IP (Transmission Control Protocol/Internet Protocol) as the standard for traffic between networks, that the modern internet of today was created.
We are experiencing the same challenges today with blockchain. As soon as we connect the various closed blockchain networks to each other we will see the emergence of a new blockchain internet.
It was this idea that I used to initiate the genesis of the ISO blockchain standard, TC 307.
What we have done at Quant is create a blockchain operating system called Overledger, which sits on top of existing and future blockchains. The system allows data to be read and written across various blockchains while respecting the rules of each blockchain, such as its consensus mechanism.
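The general pattern such an overlay follows can be sketched as an adapter layer: one common read/write interface, with a separate adapter per chain that delegates to that chain's own rules. To be clear, every name below is hypothetical and chosen for illustration only; this is not Quant's actual Overledger API, and the in-memory "ledger" merely stands in for a real chain client.

```python
# Illustrative adapter-pattern sketch of a cross-chain overlay.
# All class and method names are hypothetical, not a real API.
from abc import ABC, abstractmethod
from typing import Optional

class LedgerAdapter(ABC):
    """Common interface over one underlying blockchain."""

    @abstractmethod
    def read(self, key: str) -> Optional[str]:
        """Read a value from this chain's ledger."""

    @abstractmethod
    def write(self, key: str, value: str) -> bool:
        """Write a value, respecting this chain's own rules."""

class InMemoryLedger(LedgerAdapter):
    """Stand-in for a real chain client; stores data locally."""

    def __init__(self):
        self._store = {}

    def read(self, key: str) -> Optional[str]:
        return self._store.get(key)

    def write(self, key: str, value: str) -> bool:
        # A real adapter would submit a transaction and await consensus.
        self._store[key] = value
        return True

class Overlay:
    """Routes reads and writes to whichever registered chain is named."""

    def __init__(self):
        self._chains = {}

    def register(self, name: str, adapter: LedgerAdapter) -> None:
        self._chains[name] = adapter

    def write(self, chain: str, key: str, value: str) -> bool:
        return self._chains[chain].write(key, value)

    def read(self, chain: str, key: str) -> Optional[str]:
        return self._chains[chain].read(key)

if __name__ == "__main__":
    overlay = Overlay()
    overlay.register("chain_a", InMemoryLedger())
    overlay.register("chain_b", InMemoryLedger())
    overlay.write("chain_a", "asset", "42")
    print(overlay.read("chain_a", "asset"))  # "42"
    print(overlay.read("chain_b", "asset"))  # None: chains stay separate
```

The key design point is that the overlay never bypasses a chain's own validation: each adapter encapsulates its chain's rules, which is what allows many otherwise-incompatible ledgers to be addressed through one interface.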
We’re excited to be able to shape the evolution of the two foundational technologies of AI and Blockchain and help foster innovation to develop new and revolutionary technologies.