– Recent studies by the Osservatorio Big Data Analytics e Business Intelligence show that the big data sector is becoming the real focal point for large companies, with revenue growth of 44% over the last year. It goes without saying that, when facing this vast sea of information, the ability to analyze it as completely and precisely as possible becomes central. This is the context for Expert System, a software house based in Modena and listed on Borsa Italiana's AIM market, which specializes in the analysis and management of unstructured information through a semantic approach and collaborates with intelligence services.
Read the article by Nicola Carosielli
– Twitter study: negative sentiment dominates the comments
The campaign over the constitutional referendum is growing ever more heated in tone and poisonous in its words. Certifying this is an analysis, conducted between October 24 and November 29, of 120,000 tweets by Italian Twitter users. The data was collected by selecting the main hashtags related to the referendum, such as #bastaunsì, #iovotono, #iodicosì and #iodicono.
The analysis was carried out by Expert System using a cognitive computing platform.
Read the article by Fabio Florindi
To be competitive, companies need to be able to take advantage of current data to predict what might happen in the future. Predictive analytics plays a key role in being able to capture useful information and use it to model customer behaviors, sales patterns and other trends for the future.
Although predictive analytics is often associated with data mining in describing how information is processed, there are significant differences between the two techniques. Both use algorithms to discover knowledge and find the best solutions. Data mining is an algorithm-driven process for analyzing and extracting useful information, automatically uncovering hidden patterns and relationships in data. Predictive analytics, instead, is closely tied to machine learning: machines take historical and current information and apply it to a model to predict future trends. In essence, the difference is that data mining explores the data, while predictive analytics answers the question, “What is the next step?”
Predictive analytics is the use of data, mathematical algorithms and machine learning to identify the likelihood of future events based on historical data. Its main goal is to use knowledge of what has happened to provide the best assessment of what will happen. In other words, predictive analytics can offer a complete view of what is going on and the information we need to succeed.
Thanks to the diffusion of text analytics, which has made the analysis of unstructured data less time-consuming, predictive analysis is on the rise. Today, we increasingly look to machines that can take past and current information to forecast future trends, such as sales for the coming months or years, or anticipate customer behavior, as in the case of fraudulent credit card use (learn more about how you can manage operational risk here).
Predictive analysis uses various models to assign a score to data. The most common is the predictive model, which focuses on the behavior of an individual customer. Trained on sample data with known attributes, the model can analyze new data and determine its behavior. This information can be used to predict how the customer might behave next.
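The idea of training on samples with known attributes and then scoring new data can be sketched in a few lines. The following is a minimal illustration only, not any specific product's method: it uses two invented customer attributes (monthly visits, average spend) and a nearest-centroid rule to predict a label for a new customer.

```python
# Minimal sketch of a predictive model: train on labeled samples,
# then score new data. Attributes, labels and numbers are all
# illustrative assumptions, not real customer data.

def centroid(rows):
    """Average each attribute across a list of feature vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def train(samples):
    """Compute one centroid per label from (features, label) pairs."""
    by_label = {}
    for features, label in samples:
        by_label.setdefault(label, []).append(features)
    return {label: centroid(rows) for label, rows in by_label.items()}

def score(model, features):
    """Predict the label whose centroid lies closest to the new record."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: dist(model[label], features))

# Hypothetical training set: [monthly visits, average spend] -> outcome
training = [
    ([2, 15.0], "churn"), ([1, 10.0], "churn"),
    ([9, 80.0], "stay"),  ([8, 60.0], "stay"),
]
model = train(training)
print(score(model, [2, 12.0]))  # prints "churn"
print(score(model, [7, 70.0]))  # prints "stay"
```

Real predictive systems use far richer models (regression, decision trees, neural networks), but the train-then-score structure is the same.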
This is different from the descriptive model, which is used to classify data into groups. With descriptive models, customer data is classified by characteristics such as age or previous buying behavior. This information is often used in marketing campaigns to reach a target group.
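The contrast with the predictive model can be made concrete: a descriptive model does not forecast anything, it only groups existing customers by known characteristics. A minimal sketch, with invented names and age brackets chosen purely for illustration:

```python
# Descriptive model sketch: segment customers by a known
# characteristic (age), producing groups a campaign could target.
# Brackets and customers are hypothetical.

def age_bracket(age):
    if age < 25:
        return "18-24"
    if age < 45:
        return "25-44"
    return "45+"

def segment(customers):
    """Group customer names by age bracket."""
    groups = {}
    for name, age in customers:
        groups.setdefault(age_bracket(age), []).append(name)
    return groups

customers = [("Ada", 22), ("Ben", 31), ("Cleo", 52), ("Dan", 40)]
print(segment(customers))
# {'18-24': ['Ada'], '25-44': ['Ben', 'Dan'], '45+': ['Cleo']}
```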
Today, more and more organizations are using predictive analytics to increase their business and:
Recognos Financial developed a new artificial intelligence platform “powered by Cogito” for real-time analysis of large volumes of financial data useful for decision-making processes
– There is continued and growing interest in Artificial Intelligence solutions for optimizing decision-making models that use and exploit unstructured information. Leading semantic data analysis provider Recognos Financial developed an innovative semantic platform for data management that incorporates the cognitive Cogito technology from Expert System.
The Recognos Financial platform uses Cogito within its machine learning workflow to analyze all types of text from diverse, evolving sources in the most complete and accurate way possible. For example, the data on the U.S. Securities and Exchange Commission website contains an enormous wealth of financial information, business documents, statements, forecasts, reports and prospectus from which real-time structured data may be extracted for further analysis or made immediately available for strategic decision-making activities of banks, investment funds, brokerage firms, etc.
Drew Warren, CEO of Recognos Financial stated, “Studies have shown that 80% of all data in a financial services firm is unstructured. In the evolving world of Big Data the primary goal is the aggregation and dissemination of data to facilitate better decision making. Recognos Financial is focused on leveraging the unstructured data in an organization to make this effort far more meaningful to a given firm.”
“In the financial market where the use of an ever increasing wealth of information is rapidly evolving, Artificial Intelligence offers the most convincing response to the issues of data quality,” commented Bryan Bell, EVP, Market Development for Expert System. “Cogito is distinguished by its ability to process information through the comprehension of meaning. It combines “intelligence” and power, flexibility, scalability and ease of implementation, while also offering significant cost-benefit advantages.”
More than 30 companies met with the financial community in over 130 meetings at the Small Cap Conference 2016. Hosted at the headquarters of Borsa Italiana, the event is dedicated to small- and mid-cap companies.
Listen to the interview with Stefano Spaggiari, CEO of Expert System, at the Small Cap Conference 2016.
– If we examined thousands upon thousands of tweets, could we figure out whether the web is leaning for or against the constitutional reform? Someone has tried, concluding that Yes and No are essentially neck and neck in the referendum, with No gaining ground over the last weekend. Others have instead monitored the “tone” of the online debate, from comments on websites to those on Facebook, noting that animosity is rising among No supporters, and highlighting the key figures, among them Massimo D’Alema, who surprisingly ranks third among the most-cited personalities. Two different big data companies, Expert System and Reputation Manager, offer interesting food for thought ahead of Sunday’s referendum.
When we talk about human capabilities for learning, semantic development is easy to understand: a definition I like describes it as the “gradual process of acquiring the meanings associated with words.” When a child hears a word for the first time, he or she tries to understand its meaning using past experience, intellect, memory, etc. By one year of age, children know the meaning of 50 to 100 words, and their vocabulary grows rapidly from there; by age five, children can use about 2,000 words.
But how can we define semantic development when it comes to machines and not people?
For people, learning is a lifelong process. Humans have innate and intuitive capabilities that allow them to understand the meaning of words based on a variety of elements such as experience, memory, etc. A machine, however, does not have such a reference system to support it in this process of acquiring knowledge.
So how do machines learn? To be able to acquire knowledge, machines need a system that provides access to background information, and that allows them to cope with text ambiguities and meaning recognition. Ideally, this is done in a way that is similar to how a human learns. Such a system, combined with the machine’s memory and computing capabilities, results in a powerful module capable of logical and comprehensive text understanding.
To identify a proper semantic development definition in the case of machines, we have to start from a different element. At Expert System, we call it the Semantic Disambiguator.
The semantic disambiguator is a linguistic software module that is able to resolve ambiguities and understand the meaning of each word in a text. This is possible thanks to multi-level text analysis and the semantic disambiguator’s interaction with a representation of knowledge where concepts are connected to one another by specific semantic relationships. This is our Knowledge Graph.
When the disambiguator comes across a new word (the Knowledge Graph is incredibly rich but it doesn’t know everything, just as a human can’t know everything), it tries to figure out the meaning by considering the context. To do so, it applies complex algorithms and heuristic rules to use the known words surrounding the unknown element.
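The mechanism described above, choosing a word's sense from the words around it, can be sketched with a simplified overlap heuristic (a Lesk-style approach, not Expert System's actual algorithm). The tiny "knowledge graph" below maps each sense of a word to related concepts; the senses and relations are invented for illustration.

```python
# Hedged sketch of context-based word-sense disambiguation.
# A toy knowledge store: each sense of "bank" points to the
# concepts it is semantically related to (all invented here).
SENSES = {
    "bank": {
        "bank.finance": {"money", "account", "loan", "deposit"},
        "bank.river":   {"river", "water", "shore", "fishing"},
    },
}

def disambiguate(word, context_words):
    """Pick the sense whose related concepts overlap most with the context."""
    senses = SENSES.get(word)
    if senses is None:
        return None  # unknown word: nothing in the graph to compare against
    context = set(context_words)
    return max(senses, key=lambda sense: len(senses[sense] & context))

print(disambiguate("bank", ["deposit", "money", "into", "the", "account"]))
# bank.finance
print(disambiguate("bank", ["fishing", "on", "the", "river"]))
# bank.river
```

A production disambiguator works over a vastly larger graph and applies many more signals (syntax, domains, heuristic rules), but the principle of scoring candidate senses against the context is the same.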
Like a human, Expert System’s semantic technology acquires knowledge automatically and learns from human experience. Thanks to this multi-level text analysis, semantic technology allows users to reach accuracy levels above 90% in completely open contexts, fully understanding the meaning of single words, sentences and entire documents.
Expert System will take part in the 2016 edition of the Premio Nazionale dell’Innovazione (PNI). Organized by the Università degli Studi di Modena e Reggio Emilia, the PNI is the most important national competition for new companies born out of academia and research, aimed at spreading entrepreneurial culture within universities and fostering relationships between researchers, business and finance.
The event opens on Thursday, December 1 at 2:00 pm with the public opening of the EXPO AREA at the Dipartimento di Ingegneria “Enzo Ferrari” in Via Vivarelli, Modena, followed at 3:00 pm by the conference “Shaping the future, visioni di futuro.”
As part of the “Shaping the future” conference, a workshop dedicated to the ICT and Industrial areas will be held on the morning of Friday, December 2 (9:00 am at the Tecnopolo, Via Vivarelli 2, Modena), during which Stefano Spaggiari, CEO of Expert System, will join the debate on research and innovation, bringing his entrepreneurial experience.
The final and the award ceremony of the Premio Nazionale Innovazione 2016, hosted by Riccardo Luna, journalist and curator of Idee NEXT, will take place in the afternoon from 2:00 pm at the Teatro Storchi in Modena.
For more information, the full program and event registration, visit: http://www.pni2016.unimore.it/
Text classification systems have been adopted by a growing number of organizations to effectively manage the ever-growing inflow of unstructured information. The goal of text classification systems is to increase the discoverability of information and make all the knowledge discovered available and actionable to support strategic decision making.
A text classification system requires several elements:
Essentially, there are three main text classification approaches in data mining: the “bag of keywords” approach, statistical systems and rules-based systems. Cutting through the marketing buzz to choose among them can be difficult; your selection should be based on facts, not claims. Here is a brief summary of each approach:
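To make the first and third approaches concrete, here is a minimal sketch contrasting a bag-of-keywords classifier with a rules-based one (statistical systems need training data and are omitted). The categories, keyword lists and rules are illustrative assumptions only.

```python
# "Bag of keywords" vs. rules-based classification, sketched.
# Keyword lists per category are hand-picked for illustration.
KEYWORDS = {
    "sports":  {"match", "goal", "team", "league"},
    "finance": {"stock", "market", "earnings", "shares"},
}

def keyword_classify(text):
    """Bag of keywords: count hits per category, pick the highest."""
    words = set(text.lower().split())
    return max(KEYWORDS, key=lambda cat: len(KEYWORDS[cat] & words))

def rule_classify(text):
    """Rules-based: hand-written conditions checked in priority order."""
    t = text.lower()
    if "earnings" in t and "market" in t:
        return "finance"
    if "goal" in t or "league" in t:
        return "sports"
    return "unknown"

doc = "The team scored a late goal to win the league match"
print(keyword_classify(doc))  # sports
print(rule_classify(doc))     # sports
```

The keyword approach scales by adding terms but cannot handle ambiguity; rules give precise control but must be maintained by hand, which is exactly the trade-off the comparison above is about.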
Given the increasing globalization of business and supply chains, current risk analysis techniques are unable to address the growing complexity of the risk environment. To make sense of unstructured data, companies need a risk management framework to aggregate and analyze data more efficiently and effectively.
An effective risk management framework must be more than just a set of rules and standards. Instead, it must be able to deliver actionable results. At Expert System, we leverage our cognitive technology to ensure that all of the information you manage, from any internal, external or deep web source, is available for a complete view of your risk environment. Learn more about Cogito for risk assessment.