The Post-Corona Era: From the Business Model Canvas to the Machine Learning Canvas with Tokenized Big Data
After the lockdown of physical businesses, many of them will not re-open because of losses incurred during the lockdown itself. The new so-called "Coronapreneurs" will therefore have to seize real-time opportunities to capture "momentary markets", which requires businesses to update their customer analytics and advertising. In particular, they will need to manage the transition from social, mobile, analytics and cloud to Big Data and Machine Learning. Profiling consumers by analyzing their social profiles, where they check in, and which products they tag, among other signals, will change marketing for these new businesses, which will thrive only if their owners invest in reskilling the next-generation workforce in 2020's technology trends. At the same time, ever since the European Union's GDPR laid the foundation for data privacy regulation, all businesses, new ones included, must align their analytics with similar guidelines in 2020 to protect user information.
Machine Learning systems are complex. At their core, they ingest data in a certain format and build models that can predict the future. Both science and industry are undergoing a data revolution, which has given rise to entirely new data formats and to databases of unprecedented scale. This rise of big data is an opportunity for big data and machine learning to come together: machine learning techniques can now handle modern data types, drawing on statistical and computational intelligence to navigate vast amounts of information with minimal or no human supervision.
A famous example in industry is identifying fragile customers who may stop being customers within a certain number of days (the "churn" problem). Such predictions only become valuable when they are used to inform or automate decisions (e.g. which promotional offers to give to which customers, to make them stay).
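The churn example can be sketched end to end in a few lines: score each customer with a toy logistic model, then turn the scores into a retention decision. The feature names and weights below are illustrative assumptions, not values from any real dataset; in practice the weights would be learned from historical churn labels rather than set by hand.

```python
import math

# Hypothetical per-customer features (illustrative names, made-up values).
customers = {
    "alice": {"days_inactive": 2,  "orders_last_90d": 8},
    "bob":   {"days_inactive": 45, "orders_last_90d": 0},
    "carol": {"days_inactive": 20, "orders_last_90d": 1},
}

# Toy logistic model: these weights are assumed for the sketch.
# A real system would fit them on labeled historical churn data.
W_INACTIVE, W_ORDERS, BIAS = 0.12, -0.8, -2.0

def churn_probability(feats):
    """Probability that a customer churns, via a logistic function."""
    z = (W_INACTIVE * feats["days_inactive"]
         + W_ORDERS * feats["orders_last_90d"]
         + BIAS)
    return 1.0 / (1.0 + math.exp(-z))

# The prediction only matters once it drives a decision:
# target retention offers at the customers most likely to leave.
scores = {name: churn_probability(f) for name, f in customers.items()}
at_risk = [name for name, p in sorted(scores.items(), key=lambda kv: -kv[1])
           if p > 0.5]
print(at_risk)  # customers selected for a promotional offer
```

The point of the last two lines is the one made above: the model's output is only valuable once it is wired to a concrete action, here choosing who receives an offer.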
In many organizations there is a disconnect between the people who can build accurate predictive models and those who know how best to serve the organization's objectives. One way to ease collaboration is to use a canvas, just like the widely known Business Model Canvas. In the context of data and Artificial Intelligence, a canvas can describe the actual learning that takes place in intelligent systems:
What data are we learning from?
How are we using predictions powered by that learning?
How are we making sure that the whole thing “works” through time?
The Machine Learning Canvas allows us to describe precisely this.
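As a rough illustration (not the official canvas layout), the three questions above can be captured as a plain data structure, here filled in for the churn example. The field names are a paraphrase of the questions, not the canvas's actual section titles.

```python
from dataclasses import dataclass

# A minimal sketch of a Machine Learning Canvas as a data structure.
# Field names paraphrase the three guiding questions in the text.
@dataclass
class MLCanvas:
    data_sources: list   # what data are we learning from?
    decisions: list      # how do predictions inform or automate decisions?
    monitoring: list     # how do we check the system keeps working over time?

# Hypothetical canvas for the churn problem described earlier.
churn_canvas = MLCanvas(
    data_sources=["purchase history", "login events"],
    decisions=["send retention offers to high-risk customers"],
    monitoring=["weekly comparison of predicted vs. actual churn"],
)
print(churn_canvas.decisions[0])
```

Writing the answers down in one shared artifact is the whole point: modelers and business owners argue over the same three lists instead of talking past each other.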
In particular, the right-hand side of the canvas is dedicated to learning from data, including data sources, which are usually the hardest to find. Data is indeed the oil of the new economy. In the context of any given digital application, data is where the value resides: for the companies that are paid to host it, for the platforms that sell advertising against it, and for the users who effectively trade their data for reduced-price services. Data is, in other words, an asset. Like other assets, it can be tokenized and decentralized on a public blockchain. It is not hard to imagine a future in which every meaningful piece of data in the world is represented by a private key.
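The idea of representing a piece of data with a key can be sketched minimally: identify the content by its hash, and tag ownership with a secret key. This is only an illustration of the concept, with assumed helper names; it is not the token format of any actual blockchain.

```python
import hashlib
import secrets

def tokenize(data: bytes, owner_key: bytes) -> dict:
    """Bind a piece of data to a secret key (conceptual sketch only)."""
    return {
        # Content address: anyone holding the data can verify this part.
        "data_hash": hashlib.sha256(data).hexdigest(),
        # Ownership tag: only the holder of owner_key can reproduce it.
        "owner_tag": hashlib.sha256(owner_key + data).hexdigest(),
    }

owner_key = secrets.token_bytes(32)   # stands in for a private key
record = b"customer 42: churn_risk high"
token = tokenize(record, owner_key)

# The same data and the same key always reproduce the same token,
# so the token can be checked without revealing the key itself.
assert token == tokenize(record, owner_key)
print(token["data_hash"][:16])
```

Real token standards add signatures, on-chain registration, and transfer rules on top; the sketch only shows the core binding of data to a key.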
Tying tokens to data explicitly creates a world of new options for reconfiguring how data can be acquired and used, safely, to train machine learning predictive systems. Thanks to blockchain and tokenization, it is now possible to imagine a cryptocurrency whose value is based on the supply of and demand for the data traffic being generated and collected. In such an ecosystem, data collection supports the currency itself: whoever wants to buy the data must pay with coins purchased on the market, which grows trading volume and in turn benefits the coin holders. Demand for data is very high: the total value of Big Data exchanged on the global market in 2020 is expected to be around 465 billion dollars. DTCoin is making this tokenization model an established reality in the data and Artificial Intelligence fields, presenting itself not only as an innovation but, most importantly, as a real-life aid to businesses in the post-Corona era, so that they can start building a Machine Learning Canvas without having to worry about where to get data and how to handle it securely.