Deep Learning in Real Time – Inference Acceleration and Continuous Training

Introduction: Deep learning is revolutionizing many areas of computer vision and natural language processing (NLP), infusing more and more consumer and industrial products with intelligence capabilities that have the potential to change people's everyday experience and standard industry practices. At a high level, deep learning, like any automated system based on […]

Continue reading


Customer Lifetime Value Prediction Using Embeddings

Introduction: This paper was presented at the RE·WORK Deep Learning in Retail & Advertising Summit (London). It describes the Customer Lifetime Value (CLTV) prediction system deployed by ASOS.com, an online clothing retailer. For e-commerce companies, being able to predict CLTV more accurately delivers a major business benefit. This paper provides […]

Continue reading


30 Years of Devotion to Natural Language Processing Based on Concepts

Recently, jiqizhixin.com interviewed Mr. Qiang Dong, chief scientist of Beijing YuZhi Language Understanding Technology Co., Ltd. Dong gave a detailed presentation of their NLP technology and demoed their YuZhi NLU platform. With HowNet, a well-known common-sense knowledge base, as its basic resource, the YuZhi NLU platform conducts its unique semantic analysis based on concepts rather than […]

Continue reading


Relearn the Linguistic World in “Arrival”: An Interview with Jessica Coon

In the movie Arrival, language is one of the most powerful weapons. Moreover, the film's consultant, Jessica Coon, believes that language is its own origin: that it originates from itself. Jessica Coon is an Associate Professor of linguistics at McGill University, Canada Research Chair in Syntax and Indigenous Languages, and a linguistics expert in […]

Continue reading


Applying Multinomial Naive Bayes to NLP Problems: A Practical Explanation

1. Introduction: Naive Bayes is a family of algorithms based on applying Bayes' theorem with a strong (naive) assumption that every feature is independent of the others, in order to predict the category of a given sample. They are probabilistic classifiers: they calculate the probability of each category using Bayes' theorem, and the category with the […]
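As a rough illustration of the idea in this excerpt, here is a minimal multinomial Naive Bayes sketch: count word frequencies per category, score a new sample with each category's (smoothed) probabilities, and pick the maximum. The tiny training corpus and class labels are invented for demonstration, not taken from the article.

```python
from collections import Counter, defaultdict
import math

# Made-up training data: (text, category) pairs.
train = [
    ("a great game", "sports"),
    ("the election was over", "politics"),
    ("very clean match", "sports"),
    ("a clean but forgettable game", "sports"),
    ("it was a close election", "politics"),
]

# Count word occurrences per class, class frequencies, and the vocabulary.
word_counts = defaultdict(Counter)
class_counts = Counter()
vocab = set()
for text, label in train:
    words = text.split()
    class_counts[label] += 1
    word_counts[label].update(words)
    vocab.update(words)

def predict(text):
    """Return the category with the highest log-posterior, using
    Laplace (add-one) smoothing so unseen words don't zero out the score."""
    best, best_score = None, float("-inf")
    for label in class_counts:
        # log prior of the class
        score = math.log(class_counts[label] / len(train))
        total = sum(word_counts[label].values())
        for w in text.split():
            # smoothed log-likelihood of each word given the class
            score += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

print(predict("a very close game"))  # -> sports
```

Working in log space avoids numerical underflow when multiplying many small probabilities, which matters once documents are longer than a few words.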

Continue reading


Feedback Sequence-to-Sequence Model – Gonna Reverse Them All!

This tutorial assumes that you have a good understanding of the basics of Recurrent Neural Networks and Backpropagation Through Time (BPTT) and how these models actually work. Terminology: One-to-one: problems concerned with learning a direct relation between an input word and an output word. For example, the relation (like, love) is considered to […]

Continue reading


Deep Learning Paper Sparks Online Feud!

Feature image created by Jannoon028 – Freepik.com. Researchers Yoav Goldberg and Yann LeCun face off over natural language processing. Social media is humanity's new intellectual battlefield. Sports fans, social justice warriors, and even the President of the United States tweet, take to discussion boards, or make memes to mock, preach, thrust, and parry against […]

Continue reading


Epic’s Tim Sweeney: Deep Learning A.I. Will Open New Frontiers in Game Design

“[Video game] AI is still in the dark ages,” Epic CEO Tim Sweeney told a crowd gathered for GamesBeat's 2017 industry summit. The video game industry has witnessed a tremendous amount of growth, thanks to the incredible increase in computational power for visual rendering. Using the parallel computation ability of GPUs, powerful […]

Continue reading


[Thesis Tutorials II] Understanding Word2vec for Word Embedding II

Previously, we talked about the Word2vec model and its Skip-gram and Continuous Bag of Words (CBOW) neural networks. Typically, when we train any Word2vec model, we need a huge amount of data; the number of words in the corpus could be in the millions. As you know, we want Word2vec to build vector representations for the words […]

Continue reading


[Thesis Tutorials I] Understanding Word2vec for Word Embedding I

Vector Space Models (VSMs): a conceptual term in Natural Language Processing. A VSM represents words as a set of vectors, and these vectors are considered the new identifiers of the words. The vectors are used in mathematical and statistical models for classification and regression tasks. They must also be unique, in order to distinguish between […]

Continue reading