In 1999, Tim Berners-Lee stated: “I have a dream for the web [in which computers] become capable of analyzing all the data on the web – the content, links, and transactions between people and computers. A ‘Semantic Web’, which should make this possible, has yet to emerge…”
Since then, the idea of a semantic web has helped generate and sustain our expectations around Web 3.0.
Eighteen years later, I wonder if we have finally realized that dream.
Are we in the Web 3.0 era?
Have we crossed the line where computers, and their software, are really capable of analyzing all of the data on the web?
My answer is yes. And if we haven’t completely crossed that line, we are at least standing firmly on it.
If you think about it, today’s computers have the speed and the power to face the huge amount of information available on the web. However, we know that being fast and powerful is not enough. Even if gathering and analyzing all of that data is quite an achievement on its own, making sense of it is the ultimate goal.
As we all know, making sense of something requires that we fully understand it. This has always been a central concern of Web 3.0. A semantic web should gather and categorize all of the information in a way that is comprehensible to both computers and humans. But since they don’t share the same language, how can they understand the information on the web in a common way?
Expert System responded to this question with the Cogito Cognitive Technology.
In Cogito, semantic technology combined with machine learning enables computers to understand information much as a human does. Its ability to mimic the human way of thinking is based on a deep and wide representation of knowledge, the Cogito Knowledge Graph. This is what enables Cogito to read, comprehend and learn like we do.
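To give a flavor of what a knowledge graph is, here is a minimal, purely illustrative sketch in Python. It has nothing to do with Cogito’s proprietary technology; it simply shows the general idea that knowledge can be stored as subject–relation–object triples and traversed to draw simple inferences, such as finding every broader category a concept belongs to.

```python
# Toy knowledge graph as a set of (subject, relation, object) triples.
# Purely illustrative; not Cogito's actual representation.
triples = {
    ("jaguar", "is_a", "animal"),
    ("jaguar", "is_a", "car_brand"),
    ("animal", "is_a", "living_thing"),
}

def objects(subject, relation, kb):
    """Return all objects linked to a subject via a given relation."""
    return {o for (s, r, o) in kb if s == subject and r == relation}

def all_hypernyms(subject, kb):
    """Follow 'is_a' links transitively to collect broader categories."""
    seen, frontier = set(), {subject}
    while frontier:
        nxt = set()
        for s in frontier:
            for o in objects(s, "is_a", kb):
                if o not in seen:
                    seen.add(o)
                    nxt.add(o)
        frontier = nxt
    return seen

print(all_hypernyms("jaguar", triples))
```

Note how the ambiguous word “jaguar” is linked to both an animal and a car brand; a real semantic engine would use surrounding context to choose between such readings, which is precisely the kind of disambiguation a rich knowledge graph supports.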
If crossing the line of Web 3.0 was still a dream in ’99 because we were waiting for a true semantic web, today, thanks to new adaptive solutions like Cogito, we can see that dream come true.