Yesterday, we briefly mentioned that we attended the opening keynote of RoboUniverse, "IBM Watson Solutions and Cognitive Systems," presented by Robert High. Now that we have caught our breath, we would like to share a bit more about this great conference!
First introduced to the public in 2011 during an episode of Jeopardy!, Watson is IBM's "technology platform that uses natural language processing and machine learning to reveal insights from large amounts of unstructured data". In other, simpler words, Watson is a system meant to augment human cognition: it finds the information and insights that help us make a decision. For example, it can provide a doctor with the data needed to decide on a treatment, which is valuable because visits with a doctor are usually short, and when the data is large it takes time to select the relevant pieces of information.
This cognitive system learns its behavior through education, like a human would. It supports forms of expression that are more natural for human interaction, so you don't have to speak like a robot to interact with it! The team is now working on making Watson understand tone changes and human emotions. (It is not only about algorithms: the system also presents its degree of confidence in each response.)
Watson is not just one system, but an entire ecosystem that is taking off. For example, Watson became a chef to help Bear Naked develop its products through cognitive cooking, and it shared some of its "smartness" with Cognitoys, toys created for the educational needs of children.
Mr. High also introduced us to the work IBM is doing around robots. Their assumption is that robotic systems can facilitate human-machine interaction through anthropomorphic animation, so it made sense to build a robot that uses some of Watson's intelligence. The robot, Pepper, learned, for example, how to estimate a client's age and how to move its hands the way humans do when they interact with clients.
His team is now working on making robots understand the different forms of human expression: written, verbal, visual and tactile.
The conclusion he offered us is that our computing systems will become cognitive, i.e. based on human forms of expression, so that computers understand us better rather than us having to understand the computer.
Watson's Chronology:
In 2011: Jeopardy introduction
In 2012: Watson Discovery Advisor leverages the factoid pipeline around specific domains to help find the questions you are not thinking of asking
(This is a major step: going beyond answering questions to coming up with new ones)
In 2013: Watson Engagement Advisor
In 2014: Realizing how big Watson had become, IBM opened it up as a platform
In 2015: The Watson Developer Cloud dramatically expanded the range of services designed to interpret the human condition, and made them available on Bluemix (a quick sketch of what calling one of these services looks like follows below)
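To make that last point a bit more concrete, here is a minimal sketch of what consuming one of those services looked like for a developer. This is our own illustration, not something shown in the keynote: the choice of the Tone Analyzer service, the endpoint URL, the version date, and the credentials are all assumptions standing in for whichever Watson Developer Cloud service you would bind to your Bluemix application.

```python
# Sketch (not from the talk) of what "available on Bluemix" meant in practice:
# each Watson Developer Cloud service was exposed as a REST API that you called
# with the credentials of a service instance bound to your Bluemix account.
# Service name, URL, and version date below are illustrative assumptions.
import requests

# Hypothetical credentials copied from a Bluemix service instance.
USERNAME = "your-service-username"
PASSWORD = "your-service-password"

# Assumed endpoint pattern for the Tone Analyzer service at the time.
URL = "https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone"

response = requests.post(
    URL,
    params={"version": "2016-05-19"},           # API version date (assumption)
    auth=(USERNAME, PASSWORD),                  # HTTP basic auth with service credentials
    headers={"Content-Type": "application/json"},
    json={"text": "I am thrilled about what cognitive systems can do!"},
)
response.raise_for_status()
print(response.json())                          # tone/emotion scores returned as JSON
```

The same pattern, a plain REST call authenticated with the credentials of a bound service instance, applied to the other Watson services exposed on Bluemix.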