We are in the process of defining a new way of doing machine learning, focusing on a new paradigm, the data fabric.
In the previous article I gave my new definition of machine learning:
Machine learning is the automatic process of discovering hidden insights in a data fabric, using algorithms that can find those insights without being specifically programmed to do so, in order to create models that solve a particular problem (or several).
The premise for understanding this is that we have created a data fabric. For me, the best tool out there for building one is Anzo, as I mentioned in other articles.
With Anzo you can build something called "The Enterprise Knowledge Graph", and from it create your data fabric.
But now I want to focus on a topic inside machine learning: deep learning. In another article I gave a definition of deep learning:
Deep learning is a specific subfield of machine learning, a new take on learning representations from data which puts an emphasis on learning successive “layers” [neural nets] of increasingly meaningful representations.
Here we’ll talk about a combination of deep learning and graph theory, and see how it can help move our research forward.
In this article we will:

- Set the basis for doing deep learning on the data fabric.
- Describe the basics of deep learning on graphs.
- Explore the library Spektral.
- Validate the possibility of doing deep learning on the data fabric.
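Before diving in, it helps to see the core operation that deep learning on graphs is built on: the graph convolution, which propagates each node's features along the graph's edges before applying a learned transformation. The sketch below is plain NumPy, not Spektral's actual API (Spektral wraps this idea in Keras layers such as `GCNConv`); the toy graph, feature matrix, and weight matrix are made up purely for illustration.

```python
import numpy as np

def gcn_layer(adjacency, features, weights):
    """One graph-convolution layer: average node features over
    neighbors (with symmetric normalization), then apply a learned
    linear map followed by ReLU. A plain-NumPy sketch of the rule
    that graph deep learning libraries like Spektral implement."""
    a_hat = adjacency + np.eye(adjacency.shape[0])      # add self-loops
    d_inv_sqrt = np.diag(a_hat.sum(axis=1) ** -0.5)     # D^(-1/2)
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt            # normalized adjacency
    return np.maximum(a_norm @ features @ weights, 0.0) # propagate + ReLU

# Hypothetical toy graph: 3 nodes, edges 0-1 and 1-2,
# 2 input features per node, 4 hidden units.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
W = np.random.default_rng(0).normal(size=(2, 4))

H = gcn_layer(A, X, W)
print(H.shape)  # (3, 4): one 4-dimensional representation per node
```

Stacking several such layers is what yields the "successive layers of increasingly meaningful representations" from the definition above, with each layer mixing in information from nodes one hop further away.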
If we can construct a data fabric that supports all of a company's data, then deep learning, the automatic process of discovering insights by learning increasingly meaningful representations from data with neural networks, can run inside the data fabric itself.