What is cognitive analytics?

The majority of data that organizations deal with is unstructured. Making sense of it—making it available for your business priorities, namely decision making—is beyond human capacity at the scale of today's information. Cognitive computing brings together a number of applications to reveal context and find answers hidden in large volumes of information. If you're looking to apply intelligent technologies in your business, cognitive analytics is a great place to start. In this post we answer the question: what is cognitive analytics?

Since the now famous artificial intelligence conference at Dartmouth College in 1956, intelligent technologies have been slowly gaining ground. IDC projects that by 2019, the worldwide market for cognitive software platforms and applications will grow to about $16.5 billion, up from $1.6 billion in 2015.

Intelligent technologies are uniquely positioned to address the new enterprise data reality of high volume, multi-source, primarily text-driven information. According to IDG’s “Big Data and Analytics: Insights into initiatives and strategies driving data investments, 2015”, enterprises that adopt data-driven projects use primarily unstructured data sources such as customer databases, email, transactional data, worksheets and Microsoft Word documents.

For many organizations, however, this is just the tip of the iceberg in terms of the available resources. Open source information available on the internet, such as regulatory and patent information, census data and social media posts, is a valuable part of the information ecosystem. Add in Internet of Things data from sensors and you have a level of complexity that is both a challenge and an opportunity.

The risk of leaving this information on the table is too great. Thanks to today's abundance of fast, cheap computing power, the application of intelligent technologies such as cognitive analytics is more accessible and more affordable than ever before.

What is cognitive analytics?

Cognitive analytics applies intelligent technologies to bring all of these data sources within reach of analytics processes for decision making and business intelligence. Here’s how:

What is cognitive analytics? Intelligent analytics

Cognitive analytics applies human-like intelligence to certain tasks, such as understanding not only the words in a text but the full context of what is being written or spoken, or recognizing objects in an image, all within large amounts of information. To accomplish this, cognitive analytics brings together a number of intelligent technologies, including semantics, artificial intelligence algorithms and learning techniques such as deep learning and machine learning. By applying such techniques, a cognitive application can get smarter and more effective over time, learning from its interactions with data and with humans.
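To make that last idea concrete, here is a minimal sketch, assuming scikit-learn and an invented pair of labels, of a text classifier that improves incrementally as humans feed it corrected examples. It illustrates incremental learning in general, not Expert System's or any particular product's technology:

```python
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

vectorizer = HashingVectorizer(n_features=2**16)  # stateless, so it works on streaming text
model = SGDClassifier()                           # supports incremental learning via partial_fit
CLASSES = ["complaint", "inquiry"]                # hypothetical labels, for illustration only

def learn_from_feedback(texts, labels):
    """Fold a new batch of human-corrected examples into the model."""
    X = vectorizer.transform(texts)
    model.partial_fit(X, labels, classes=CLASSES)

def classify(texts):
    return model.predict(vectorizer.transform(texts))

# Each round of feedback refines the model without retraining from scratch.
learn_from_feedback(["My order never arrived"], ["complaint"])
learn_from_feedback(["What are your opening hours?"], ["inquiry"])
print(classify(["Where is my package?"]))
```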

What is cognitive analytics? Analytics that puts data first

Cognitive analytics is a data-forward approach that starts and ends with what's contained in information. This unique way of approaching the entirety of information (all types and at any scale) reveals connections, patterns and collocations that enable unprecedented, even unexpected insight.

Applied in the enterprise, cognitive analytics can be used to bridge the important gap between large volumes of information and the need to make decisions in real time. A deep understanding of information helps companies draw from the wide variety of sources in their knowledge base to improve the quality of enterprise knowledge, strengthen competitive positioning and provide a deep, personalized approach to customer service.

Better Decisions Faster

– In today's fast-paced Competitive Intelligence (CI) environment, timing is everything. With the Luxid Navigator, you have access to extensive and integrated information on trials, news, research, publications, patents, regulatory and proprietary data. With a comprehensive, global view of over 9 million key opinion leaders/rising stars, a real-time competitive landscape and hourly alerts, Luxid Navigator gives you the platform to ensure you are always one step ahead of the game.

Join us on August 24th at 12pm ET for this informative and engaging webinar.  You will learn how to:


Do not miss this unique opportunity to discover this user-friendly portal application that aggregates biopharma open data and offers best-in-class analytics and visualizations to help you find the answers you're looking for.

Register for this free webinar here!

KDnuggets

– When it comes to business value and ROI, does machine learning live up to the claims? We'll explore a pure machine learning approach through the lens of a typical enterprise use case.

Today's push behind machine learning and artificial intelligence is so powerful that it has led to some pretty high expectations for performance and deliverables. When it comes to business value and ROI, can it really live up to the claims? Evaluating the value of a full machine learning project requires moving past the hype and managing the realities of this evolving technology, one example at a time. We'll look at the realities of a pure machine learning approach through the lens of a typical enterprise use case.

Read the article

Natural language processing (NLP) deals with the relationship between computers and human language. More specifically, natural language processing is the computer understanding, analysis, manipulation and/or generation of natural language (according to dictionary.com).

Natural language refers to audible speech as well as text, and all languages are included. NLP systems capture meaning from an input of words (sentences, paragraphs, pages, etc.) and produce a structured output (which varies greatly depending on the application). Natural language processing is a fundamental element of artificial intelligence.
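As a concrete illustration of "words in, structured output out," here is a minimal sketch using the open-source spaCy library (one NLP toolkit among many, chosen only for brevity; not the technology discussed elsewhere in this post):

```python
import spacy

nlp = spacy.load("en_core_web_sm")  # small English model, installed separately
doc = nlp("Apple acquired a startup in London for $50 million.")

# Each named entity becomes a (text, type) record: structured output.
print([(ent.text, ent.label_) for ent in doc.ents])
# e.g. [('Apple', 'ORG'), ('London', 'GPE'), ('$50 million', 'MONEY')]
```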

What is Natural Language Processing? Approaches

Symbolic Approach: The symbolic approach to natural language processing is based on human-developed rules and lexicons. In other words, this approach is grounded in the generally accepted rules of speech within a given language, which are formalized and recorded by linguistic experts for computer systems to follow.

Statistical Approach: The statistical approach to natural language processing is based on observable and recurring examples of linguistic phenomena. Models based on statistics recognize recurring themes through mathematical analysis of large text corpora. By identifying trends in large samples of text, the computer system can develop its own linguistic rules, which it then uses to analyze future input and/or generate language output.

Connectionist Approach: Most simply, the connectionist approach to natural language processing is a combination of the symbolic and statistical approaches. This approach starts with generally accepted rules of language and then tailors them to specific applications from input derived from statistical inference.
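A toy example may help contrast the first two approaches. In the sketch below (illustrative only; the lexicon and training data are invented), the symbolic classifier follows a hand-written rule, while the statistical one learns its own rule from examples using scikit-learn:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Symbolic: meaning comes from rules and lexicons written by people.
NEGATIVE_WORDS = {"terrible", "awful", "broken"}  # a tiny hand-built lexicon

def symbolic_sentiment(text):
    return "negative" if NEGATIVE_WORDS & set(text.lower().split()) else "positive"

# Statistical: meaning comes from patterns observed in example data.
train_texts = ["great product", "works perfectly", "terrible support", "awful and broken"]
train_labels = ["positive", "positive", "negative", "negative"]
vec = CountVectorizer()
model = MultinomialNB().fit(vec.fit_transform(train_texts), train_labels)

text = "the support was terrible"
print(symbolic_sentiment(text))                 # the hand-written rule fires on "terrible"
print(model.predict(vec.transform([text]))[0])  # the learned model reaches the same verdict
```

A connectionist system, in this simplified view, would start from rules like the first classifier's and refine them with evidence like the second's.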

How Systems Interpret Language

Morphological Level: Morphemes are the smallest units of meaning within words, and this level deals with morphemes in their role as the parts of which words are composed.

Lexical Level: The lexical level examines how the parts of words (morphemes) combine to make words, and how slight differences can change the meaning of the final word dramatically.

Syntactic Level: This level focuses on text at the sentence level. Syntax revolves around the idea that in most languages the meaning of a sentence is dependent on word order and dependency.

Semantic Level: Semantics focuses on how the context of words within a sentence helps determine the meaning of words on an individual level.

Discourse Level: This level examines how sentences relate to one another; sentence order and arrangement can affect their meaning.

Pragmatic Level: This level bases the meaning of words or sentences on situational awareness and world knowledge; essentially, the meaning that is most likely and makes the most sense.
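To see a few of these levels at once, here is a quick sketch, again using spaCy purely as a stand-in for any NLP pipeline: the lemma reflects the morphological and lexical levels, while part of speech and dependency role reflect the syntactic level:

```python
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("The matches were cancelled yesterday.")

for token in doc:
    # lemma_ -> base form (morphological/lexical levels)
    # pos_   -> part of speech; dep_ -> role in sentence structure (syntactic level)
    print(f"{token.text:10} lemma={token.lemma_:10} pos={token.pos_:6} dep={token.dep_}")
```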

Ultimate Goal

The ultimate goal of natural language processing is for computers to achieve human-like comprehension of texts/languages. When this is achieved, computer systems will be able to understand, draw inferences from, summarize, translate and generate accurate and natural human text and language.

Whether it's artificial intelligence or cognitive computing, today's most innovative technologies are working to take advantage of human language and its byproducts (by understanding, predicting or replicating them) to create applications that are increasingly innovative and intelligent. Teaching a computer to really understand human communication starts not with matching words based on how they appear, but with understanding the meaning of words; therefore, it starts with semantics.

What is semantics? If you Google "semantics", you'll find over 19 million results. Without getting too technical or academic, let's look at a few examples that explain what semantics is and how we apply it in understanding information.

What is semantics?

We can start by thinking of semantics as the "magic" that happens when people communicate and, most importantly, when they understand each other. This magic is actually a combination of being able to understand words and the relationships between words and phrases, along with our general knowledge and acquired experience using them. As adults, we no longer have to think about what most words mean; we just innately know what they are, when to use them, how to apply them and who to use them with.


What is semantics? Context

For example, in a job interview, you (ideally) wouldn’t greet your potential employer by saying, “What’s up?” Our social conditioning tells us that we would only use such language in a specific social context (i.e. with friends or family). Based on our exposure to many different contexts, we have come to know what to expect and what people expect from us in various situations.

So, how do we know which words to put together to best convey our wishes, feelings, questions, and intent? And how do we know what others’ wishes, feelings, questions and intent mean?


What is semantics? Ingredients + procedure

One way to think about what semantics is…is as a recipe, where ingredients + procedure = the dish you're trying to make. In the end, you don't actually taste each individual ingredient but the product of their combination. With a cake recipe, words are the ingredients, the rules of sentence structure are the instructions, and the combination of the two is the cake itself.


What is semantics? More than words

Similarly, language is full of idiomatic expressions, slang and abbreviations that are often difficult to understand if you lack the contextual knowledge. Back to our recipe example: "it's a piece of cake" may not literally refer to cake, but to something else entirely. These nuances can confuse both people and technology, especially in the application of question answering. For example, the question "what time is the match?" has a variety of answers (no matter what time it is): It's at 4:15; it's at a quarter past four; it's just after 4 p.m.; it's at 15 after 4, etc.
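A hypothetical sketch of how a question-answering system might cope with this: all of those surface forms normalize to the same canonical time. The patterns below are invented purely for illustration:

```python
import re

def normalize_time(phrase):
    """Map several surface forms of a time expression to one canonical H:MM."""
    phrase = phrase.lower()
    if m := re.search(r"quarter past (\w+)", phrase):
        return f"{_to_number(m.group(1))}:15"
    if m := re.search(r"(\d{1,2})[:.](\d{2})", phrase):
        return f"{int(m.group(1))}:{m.group(2)}"
    if m := re.search(r"(\d{1,2}) after (\d{1,2})", phrase):
        return f"{int(m.group(2))}:{int(m.group(1)):02d}"
    return None

def _to_number(word):
    return {"four": 4, "five": 5}.get(word, word)  # tiny lookup, demo only

# Very different words, same meaning:
for answer in ["It's at 4.15", "a quarter past four", "it's at 15 after 4"]:
    print(answer, "->", normalize_time(answer))  # all print 4:15
```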

Comprehension is not just a matter of words. Words are the symbols that we use to refer to things and words are also a means for conveying something. But, words alone can’t explain what semantics is. In order to fully understand other human beings we must rely on our innate capability to semantically discern and decipher.


What is semantics? Understanding intention

But, what if we don’t use words at all? Messages can be conveyed using facial expressions, posture, sounds (like groaning or sighing) and even via creative expression, such as through a work of art.
When we look at the Mona Lisa, for example, almost everyone would agree that it is a representation of a lady, but does that also mean we understand the painter's intentions? Unfortunately, no. The same thing happens with words: even if you have a rich vocabulary and vast world knowledge, you can still misunderstand or misinterpret language.

To understand a work of art, you need to combine a number of types of knowledge and experience, including your subjective opinion, to come up with an interpretation. In the Mona Lisa, you might notice how her hands are folded and her smile (or is it a smirk?), but also consider the time period in which it was painted and where the artist was when he painted it; in other words, the painting's context.

Similarly, when you consider words within a context, you are able to understand both their meaning and their message, what they are trying to convey. That's semantics!

Gazzetta di Modena

– Starting in October, a course for twenty bright minds who will live and work in close contact for six months. Professor Colajanni: "The goal is to create new defense systems."

A school for hackers, where, to be clear, "hacker" means an expert in cyber defense, not a criminal attacking websites around the world. It is a choice, as revolutionary as it is necessary, that the University of Modena has made alongside several local companies, foremost among them Expert System.

Read the article

"Did Melania Trump use the same language as Michelle Obama?" This is the question Expert System, a leader in the development of semantic software for the strategic management of information and big data, listed on the AIM Italia market of Borsa Italiana, set out to answer by analyzing the speeches of Melania Trump and Michelle Obama with Cogito. The answer is "no": overall, the two speeches appear very different, both linguistically and in terms of style.

Although the passage on values and the future in Melania Trump's speech is unmistakably similar to the corresponding passage in the speech Michelle Obama delivered in Denver in 2008, the vocabulary, linguistic register, themes and emotions in the two speeches are not equivalent.

In general, the vocabulary used by Melania Trump appears richer than Michelle Obama's (a lexical richness of 78.6% for Trump versus 48.47% for Obama); Trump's syntax is simpler (her sentences contain about 16 words, Obama's about 28), which also makes Trump's text more readable than Obama's (a reading ease score of 39.3% for Melania Trump's speech versus 23.35% for Michelle Obama's).
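Cogito's exact formulas are not public, but the kinds of metrics cited above can be approximated with simple measures: a type-token ratio as a stand-in for lexical richness, and mean sentence length as a proxy for syntactic simplicity. A rough sketch:

```python
import re

def lexical_richness(text):
    """Type-token ratio: unique words divided by total words."""
    words = re.findall(r"[a-z']+", text.lower())
    return len(set(words)) / len(words)

def avg_sentence_length(text):
    """Average number of words per sentence."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-z']+", text.lower())
    return len(words) / len(sentences)

speech = "We want our country to move forward. We want the best for our children."
print(f"richness: {lexical_richness(speech):.1%}, "
      f"words/sentence: {avg_sentence_length(speech):.1f}")
```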

While hope is central to Michelle Obama's speech, the idea of desire comes first in Melania Trump's (love and success are equally present in both). Looking more closely at the sentences Cogito identified as the most relevant, themes tied to the Presidency (President) and to the future of the nation (USA, our Country) play a central role in both speeches. But while Melania Trump's style seems more direct and incisive (Donald wants our country to move forward in the most positive of ways), Michelle Obama's appears more elaborate, using repetition to convey emphasis and clarity (Millions of Americans who know that Barack understands their dreams; Millions of Americans who know that Barack will fight for people like them; and that Barack will finally bring the change we need).

"The results of our analysis may seem surprising given the storm that broke over Melania Trump after her appearance at the Cleveland convention," said Luca Scagliarini, CMO of Expert System. "And yet, taken as a whole, the aspiring First Lady's speech appears very different from the one Michelle Obama delivered in Denver in 2008. This is precisely one of the most interesting aspects of text analytics: analyzing language and unstructured information, testing hypotheses and sometimes even ending up proving the opposite of the initial assumption."

More details on the analysis Expert System performed by applying Cogito's intelligence to the transcripts of Melania Trump's and Michelle Obama's speeches are available in a report published here.

Expert System’s Independent Text Analysis of Melania Trump and Michelle Obama’s Speeches Confirms Strong Linguistic Differences

The following report was developed by Expert System using Cogito cognitive technology to analyse Melania Trump and Michelle Obama’s speeches.

The data used for this document was taken from transcripts of Melania Trump's speech (http://www.people.com/article/melania-trump-michelle-obama-similar-convention-speeches, July 18, 2016) and Michelle Obama's 2008 speech (source: http://edition.cnn.com/2008/POLITICS/08/25/michelle.obama.transcript/).

Expert System's analysis offers only a quantitative analysis of the extracted data; we simply focused on the transcript text, processed both speeches with our Cogito technology, and asked ourselves a simple question:

“From a linguistic standpoint, did Melania and Michelle say basically the same things?”

When we consider the complete speech, beyond the controversial passage, the answer is, “No”.

Download the report


The reality of today's big data problems? More than just a buzzword, big data has become a major disruption that is now part of every organization's arsenal of tools for intelligence about every aspect of its business, in support of decision making (at least it should be). However, no one said it would be easy. While the opportunity is clear, many organizations are still struggling with big data problems.

Big data doesn’t mean more accessible information

When you have a small amount of data, not only is it easier to manage and handle, it's also easier to find what you're looking for. The reverse is also true: the larger your big data, the noisier it will be and the harder it will be to distill true signals from the noise and useless information. It's not an uncommon problem; according to MIT Technology Review, "only 0.5 percent of all data is ever analyzed."

No matter the size of your data repository, however, to tackle this big data problem and be able to access the information useful for your business, it helps to start with a clear and specific hypothesis of what you want to find.

Information is not the same as understanding

Another big data problem is that information does not equal understanding. Big data may reveal trends and correlations, but only between the variables you choose, and it tells you nothing about the bigger picture. There is also the problem of uncontrolled variables: an analysis covers only a small number of variables, and it never tells us which correlations are meaningful. We have no idea whether a relationship is causal, driven by a hidden variable, or an accidental correlation. Big data can easily be gamed to create false positives with no actual meaningful connection between the variables. This leads us to another big data problem.
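A small numeric illustration of the false-positive problem: among enough purely random variables, some pair will look strongly correlated by chance alone. The sketch assumes numpy:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(50, 200))       # 50 observations of 200 unrelated variables

corr = np.corrcoef(data, rowvar=False)  # 200 x 200 correlation matrix
np.fill_diagonal(corr, 0)               # ignore each variable's self-correlation
print(f"strongest 'relationship' found: r = {np.abs(corr).max():.2f}")
# Prints a sizable r even though every variable is pure noise:
# correlation alone proves nothing.
```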

The bias in big data

A recent big data problem has emerged regarding bias in data. Studies from the Federal Trade Commission and universities such as Harvard and Princeton have highlighted the tendency of information to carry the biases of those compiling it. As the authors of the report “Big Data’s Disparate Impact” from Princeton’s Center for Information Technology Policy write, “An algorithm is only as good as the data it works with.”

You'll want to be as objective as possible to ensure that your analysis does not carry forward any company or personal biases, in which results or signals that go against one's experience or expectations may be discounted. Similarly, you'll also want to have as much information as possible about all the variables in your business—both inside your organization and across all of the markets you serve—so that you can better read the signals you're seeing in analysis.

Big Data, big security issues

As organizations collect ever more data, its security, and the compliance, legal, financial and reputational ramifications in the event of a breach, is a top concern. According to ZDNet, more than one billion personal records were illegally accessed in 2014. In 2015, the U.S. Office of Personnel Management alone suffered a series of breaches that resulted in the theft of data on 22 million employees, including the fingerprints of 5 million. The output of big data analysis is also at risk: according to the IDG Enterprise 2016 Data & Analytics Survey, 39% of respondents are protecting this output with alternate or additional security.

The contents of big data, from sensitive information about customers and R&D pipelines to the intelligence and personal data in government databases, are simply too important not to protect with strict security measures that ensure the integrity and safety of people and information.

Big Data could make the difference with the right metrics

Big data isn't going away. With 51% of organizations preparing to increase their big data investment in the next two to three years (according to Forbes), working now to prepare for these big data problems will put you in a significantly better position to take advantage of big data's benefits.

Expert System Deutschland GmbH is proud to be part of the program of SEMANTiCS, taking place at Leipzig University, September 12-15.

Come hear Stefan Geissler, Managing Director of Expert System Deutschland GmbH, speak on “Hybrid semantic document enrichment using machine learning and linguistics – The Cogito Studio”.

Details
Session: WiFa SR 4 – Document Management & Publishing II
Date: Wednesday, Sept 14th

Learn more