© 2018 American Innovations Consulting. All Rights Reserved.

The Complementary (vs. Competitive) Relationship between Classic Natural Language Processing & 21st Century Machine Learning

March 21, 2018

For those peripherally aware of computer science’s ML and NLP sub-disciplines, there’s a good chance you’ve heard the two terms used interchangeably – particularly in business and marketing settings. The two domains, however, have dramatically different origins, and specialists in one don’t necessarily have the expertise to solve problems in the other. NLP emerged in the mid-20th century and was tightly coupled with linguistic principles: a given language’s syntax, morphology, lexicon, etc. – the entire grammatical structure that constitutes a language’s very essence – had to be programmed into the resulting NLP software. Further, for those requiring NLP capabilities in multiple languages, this linguistic and computational heavy lifting had to be performed separately for each desired language.
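To make that contrast concrete, here is a minimal, purely illustrative sketch of the classic rule-based approach. The suffix-stripping rules below are toy examples invented for this post (not drawn from any real system); notice that each language needs its own hand-built rule set, which is exactly the per-language heavy lifting described above.

```python
# Toy sketch of classic rule-based NLP: suffix-stripping lemmatization.
# Real systems encode a language's full morphology, syntax, and lexicon;
# these few hypothetical rules only hint at that effort.

ENGLISH_SUFFIX_RULES = [
    ("ies", "y"),    # "studies" -> "study"
    ("ing", ""),     # "walking" -> "walk"
    ("ed", ""),      # "walked" -> "walk"
    ("s", ""),       # "cats"   -> "cat"
]

SPANISH_SUFFIX_RULES = [
    ("ando", "ar"),  # "caminando" -> "caminar"
    ("iendo", "er"), # "comiendo"  -> "comer"
    ("os", "o"),     # "gatos"     -> "gato"
]

def lemmatize(word, rules):
    """Apply the first matching suffix rule; fall back to the word itself."""
    for suffix, replacement in rules:
        if word.endswith(suffix) and len(word) > len(suffix):
            return word[: -len(suffix)] + replacement
    return word

print(lemmatize("walking", ENGLISH_SUFFIX_RULES))    # walk
print(lemmatize("caminando", SPANISH_SUFFIX_RULES))  # caminar
```

A second language means a second rule table – and real grammars are vastly larger and messier than these toy lists suggest.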


Machine learning, on the other hand, is rooted in hardcore mathematics, using numerical values – not human-derived language – as its primary inputs and outputs. Although this field emerged in the same era as its NLP counterpart, it has made earth-shattering headway this past decade thanks to a combination of factors, including the explosive growth in available data (thank you, mobility and social media) and the rise of a machine learning sub-discipline called ‘deep learning’.
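That numeric core can be sketched in a few lines: fitting a single model parameter to numeric samples by gradient descent. The data points and learning rate below are invented for illustration; the point is that nothing linguistic appears anywhere – inputs and outputs are just numbers.

```python
# Minimal sketch of machine learning's numeric core: fit y = w * x
# to a handful of (x, y) samples by gradient descent on squared error.

data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 7.8)]  # roughly y = 2x

w = 0.0      # model parameter, learned from the data
lr = 0.01    # learning rate

for _ in range(1000):
    # Gradient of mean squared error with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

print(round(w, 2))  # 1.99 -- close to the underlying slope of 2
```

Deep learning scales this same idea up to millions of parameters, but the currency remains numbers, not words.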


If machine learning has successfully accomplished tasks such as recognizing objects in video and images, making purchasing recommendations to online shoppers, and detecting plagues such as fraud and malware, then surely it must be able to solve any and all problems related to human-generated text, right? Based on findings over the last few years, the answer, perhaps surprising to some, is no. Certainly, there are mathematically-rooted techniques that have been successfully applied to human-generated text for decades, but those techniques have not completely displaced their linguistic counterparts in speech and text processing, and most likely never will – language is just too abstract, too complex, too imperfect, and too ambiguous for the purities of math.
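One of those long-standing mathematically-rooted techniques is the bag-of-words representation, sketched below with an invented three-word vocabulary. It turns text into numbers that math can work on – and, in the same breath, it illustrates the limitation just described: word counts alone discard grammar entirely.

```python
# Sketch of the bag-of-words bridge from text to numbers. It counts
# tokens but knows nothing of grammar, so sentences with opposite
# meanings can collapse into identical vectors.

from collections import Counter

def bag_of_words(text, vocabulary):
    counts = Counter(text.lower().split())  # Counter returns 0 for absent words
    return [counts[word] for word in vocabulary]

vocab = ["dog", "bites", "man"]
print(bag_of_words("Dog bites man", vocab))  # [1, 1, 1]
print(bag_of_words("Man bites dog", vocab))  # [1, 1, 1] -- identical vector
```

“Dog bites man” and “man bites dog” become indistinguishable – a small demonstration of why purely numeric treatments of language need linguistic help.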


Through our expertise and partnerships with NLP companies such as Bitext, we’ve witnessed firsthand how NLP has become the fuel that keeps the machine learning fire burning across multiple endeavors involving human-generated speech and text. In fact, it’s Bitext’s bot middleware that enables an online ‘chatbot’ to perform more accurately and more human-like than alternative chatbot technology devoid of any linguistic knowledge.
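As a purely hypothetical illustration (this is not Bitext’s actual middleware or API – the intent phrases and lemma table below are invented), the sketch shows why a linguistic normalization layer helps a chatbot: reducing inflected user input to base forms lets even a simple intent matcher succeed where raw string matching would fail.

```python
# Hypothetical sketch of linguistic middleware in front of a chatbot:
# lemmatize the user's words, then match intents on the base forms.

INTENTS = {
    "cancel order": "cancel_order",
    "track package": "track_package",
}

# Toy lemma table; a real NLP layer derives base forms from full
# morphological analysis of the language rather than a lookup dict.
LEMMAS = {
    "cancelling": "cancel", "orders": "order",
    "tracking": "track", "packages": "package",
}

def normalize(text):
    """Lower-case the input and map each word to its base form."""
    return [LEMMAS.get(w, w) for w in text.lower().split()]

def match_intent(utterance):
    words = normalize(utterance)
    for phrase, intent in INTENTS.items():
        if all(w in words for w in phrase.split()):
            return intent
    return None

print(match_intent("I'm cancelling my orders"))  # cancel_order
```

Without the lemma step, “cancelling my orders” never matches the stored phrase “cancel order” – the linguistic layer is what makes the numeric/statistical layer effective.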


In conclusion, wherever 21st Century AI is proclaimed to be operating under the hood of a solution involving human-generated text, a buyer, investor, or user of such technology would be wise to ensure that NLP is also present. Without NLP, the human element doesn’t exist; without a human element, humans themselves will most likely give up on using AI technology. After all, we weren’t put on this planet for computers, but for each other.
