School Project

Search Engine Project

Social Media Analytics Tool


Machine Learning Project

ABB Web Based IDE

This was a project for the Web Software Development course at Aalto University. We were tasked with building a JavaScript games marketplace that also had to serve as a platform with which each submitted game could communicate. I led a team of three that became one of the best teams in the class. Our tech stack was Python, Django, PostgreSQL, Bootstrap, Vagrant, and Heroku. The features we developed were:
  • Social media login system
  • Browse and search games
  • Play games
  • Submit games
  • Record high scores
  • Save game state
  • Game developer dashboard with statistics on sales and submitted games
  • Simple payment service integration
  • RESTful API
Demo: http://gamemart.herokuapp.com/
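The game-to-platform communication could be sketched as a small message protocol handled by the platform backend. The message names (`SCORE`, `SAVE`, `LOAD_REQUEST`), in-memory storage, and function names below are illustrative assumptions, not the actual Django implementation:

```python
# Illustrative platform-side handler for messages posted by embedded games.
# Real storage would be PostgreSQL models; dicts stand in for them here.
highscores = {}    # game_id -> list of (player, score)
saved_states = {}  # (game_id, player) -> opaque game state

def handle_game_message(game_id, player, message):
    """Dispatch one message sent from a game to the platform."""
    msg_type = message["messageType"]
    if msg_type == "SCORE":
        highscores.setdefault(game_id, []).append((player, message["score"]))
        return {"ok": True}
    if msg_type == "SAVE":
        saved_states[(game_id, player)] = message["gameState"]
        return {"ok": True}
    if msg_type == "LOAD_REQUEST":
        return {"gameState": saved_states.get((game_id, player))}
    return {"error": f"unknown message type {msg_type!r}"}
```

On the browser side, the same message shapes would travel between the game iframe and the marketplace page before reaching a backend endpoint like this.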
I worked at ABB, initially as a master's thesis worker and later as a UX Designer. There, I built the UI/UX for their real-time industrial IoT calculation engine, using vanilla JavaScript. I also built a JavaScript-based C# parser/lexer that generates a syntax/lexical tree from C# code input, used in this web-based IDE to provide smart autocompletion. As part of the thesis research, I also measured the productivity gains from using the purpose-built UI/UX for creating and managing the calculations.
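The first stage of such a parser, the lexer, can be sketched with a regex-based tokenizer. The real implementation was in JavaScript and covered far more of C#; this Python sketch with an assumed, much-reduced token set only illustrates the technique:

```python
import re

# Minimal regex-based lexer sketch for a C#-like language.
# The token set here is an illustrative subset, not the real grammar.
TOKEN_SPEC = [
    ("KEYWORD",  r"\b(?:class|public|static|void|int|string|return|if|else)\b"),
    ("IDENT",    r"[A-Za-z_][A-Za-z0-9_]*"),
    ("NUMBER",   r"\d+(?:\.\d+)?"),
    ("STRING",   r'"(?:\\.|[^"\\])*"'),
    ("PUNCT",    r"[{}()\[\];,.]"),
    ("OPERATOR", r"[+\-*/=<>!&|]+"),
    ("SKIP",     r"\s+"),
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(code):
    """Yield (token_type, lexeme) pairs for a C#-like snippet."""
    for match in MASTER.finditer(code):
        if match.lastgroup != "SKIP":
            yield match.lastgroup, match.group()
```

A parser would then consume this token stream to build the syntax tree that the autocompletion logic walks.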


[Screenshots, 2017-05-30: Add New Calculation View, Calculation List View]


ABB, as one of the leading power and automation companies, is connecting millions of electrical devices and systems to an industrial internet of things called ABB Ability™. ABB Ability™ refines measured real-time data through calculations into additional soft-sensor signals and key performance indicators (KPIs) at various levels, from the system edges to the central cloud. Engineering these calculations requires a web-based IDE that provides a good developer experience, so that subject-matter experts can be productive in their work. All configurations of the calculation engine are stored in a database. Currently, the calculation engine has no frontend view, so users must work directly on the database to operate it. This is tedious work, especially when it comes to writing the calculation scripts. It is therefore important to build a user-friendly interface for configuring the calculation engine, integrated into ABB's existing frontend dashboard system. One of the most important aspects of this UI is a versatile code editor: users will write calculation code directly in the frontend web-based system, so it is vital to have an integrated code editor with some level of IDE functionality embedded in it.


The objective is to design and implement a web-based engineering tool enabling end-to-end development of the calculations, which includes:
  • Build a UI for accessing and choosing the data source, including setting up parameters, parameter types, database mappings, dependencies, and external libraries.
  • Build a UI for defining the calculation. This is where the code editor is integrated.
  • Build a UI for configuring the execution, including task definitions, periodic execution with a scheduler, event-based execution with triggers, batch jobs, simulations, and diagnostics.
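The two execution modes in that last point, scheduler-driven and trigger-driven, could be modelled roughly as follows. This is a hypothetical sketch with invented names (`CalculationRunner`, `tick`, `fire`), not ABB's actual engine:

```python
import heapq

# Hypothetical model of the execution configuration: periodic calculations
# driven by a scheduler, event-based calculations driven by triggers.
class CalculationRunner:
    def __init__(self):
        self._periodic = []   # heap of (next_run, seq, interval, calc)
        self._triggers = {}   # event name -> [calc, ...]
        self._seq = 0         # tie-breaker so the heap never compares functions

    def schedule(self, calc, interval, start=0):
        heapq.heappush(self._periodic, (start + interval, self._seq, interval, calc))
        self._seq += 1

    def on_event(self, event, calc):
        self._triggers.setdefault(event, []).append(calc)

    def tick(self, now):
        """Run every periodic calculation due at time `now`, then reschedule it."""
        results = []
        while self._periodic and self._periodic[0][0] <= now:
            due, seq, interval, calc = heapq.heappop(self._periodic)
            results.append(calc())
            heapq.heappush(self._periodic, (due + interval, self._seq, interval, calc))
            self._seq += 1
        return results

    def fire(self, event):
        """Run every calculation registered for `event`."""
        return [calc() for calc in self._triggers.get(event, [])]
```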
Yelp runs a program popular among AI enthusiasts called the Yelp Dataset Challenge, in which they open up their data for data scientists all over the world to use in innovative ways. The challenge I did was part of a machine learning course project at Aalto University. The objective was to predict user votes based on their reviews. We did that by using word occurrences in the reviews: the occurrence count of each word became a feature for our machine learning models. We tackled both a classification and a regression problem on the same data set. In the classification problem, we predicted the usefulness feedback a review received from other users; in the regression problem, we predicted the star rating the user gave. For classification we used decision tree learning and logistic regression with some feature selection, while for regression we used linear regression trained with gradient descent. We implemented everything in Python and coded some of the models from scratch.
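The regression pipeline described above can be sketched from scratch: word-occurrence features extracted from review text, then linear regression fit with batch gradient descent. The vocabulary, toy reviews, and hyperparameters below are illustrative, not the actual project code or data:

```python
# Word-occurrence features: one count per vocabulary word per review.
def word_features(reviews, vocabulary):
    return [[review.lower().split().count(word) for word in vocabulary]
            for review in reviews]

# Linear regression trained with batch gradient descent, from scratch.
def fit_linear_regression(X, y, lr=0.1, epochs=5000):
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        grad_w, grad_b = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            err = sum(wj * xj for wj, xj in zip(w, xi)) + b - yi
            for j in range(d):
                grad_w[j] += err * xi[j]
            grad_b += err
        w = [wj - lr * gj / n for wj, gj in zip(w, grad_w)]
        b -= lr * grad_b / n
    return w, b

def predict(w, b, x):
    return sum(wj * xj for wj, xj in zip(w, x)) + b
```

On the real data set the vocabulary came from feature selection over the reviews rather than being hand-picked.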


This was my first attempt at tackling a real-world machine learning problem. Although the previous year I had done something a little related to machine learning in my information retrieval project, I did not dig deep into machine learning approaches for the search engine. It was very challenging to actually learn the underlying math behind every major machine learning model, because it had been a while since I had touched hard math subjects like calculus, probability theory, and statistics. I had to relearn material from my bachelor's studies to get a good understanding of the whole system.
This was a school project for the Search Engine and Information Retrieval course at KTH, Stockholm. The task was to implement several techniques used to build an efficient search engine. We built a standalone search engine for the DavisWiki dataset, coded in Java. I implemented techniques such as the vector space model, the PageRank algorithm, and a probabilistic Monte Carlo approach to PageRank. We were also tasked with evaluating the search engine we built.
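The Monte Carlo approach to PageRank estimates ranks by simulating many random walks and counting visit frequencies instead of iterating the full transition matrix. The course project was in Java; this is a compact Python sketch of one common variant (walks terminate with probability 1 − damping, or at dangling nodes):

```python
import random

# Monte Carlo PageRank: rank = fraction of total random-walk visits.
def pagerank_mc(graph, walks=10000, damping=0.85, seed=0):
    """graph: dict of node -> list of outgoing link targets."""
    rng = random.Random(seed)
    nodes = list(graph)
    visits = {node: 0 for node in nodes}
    for _ in range(walks):
        node = rng.choice(nodes)          # start each walk at a random page
        while True:
            visits[node] += 1
            # Stop with probability 1 - damping, or at a dangling node.
            if rng.random() > damping or not graph[node]:
                break
            node = rng.choice(graph[node])  # follow a random outgoing link
    total = sum(visits.values())
    return {node: count / total for node, count in visits.items()}
```

Compared to power iteration, this converges more slowly but parallelizes trivially and gives usable approximations for the top-ranked pages early.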
This was a school project for the Business Development Lab course at KTH, Stockholm. In this course, we were tasked with building a real startup with a real product, real customers, and a solid business model. We had about six months to go from group forming and ideation through prototyping to product launch. The product idea my team decided to work on was a social media analytics tool. The brand name we chose was LettuceMine, a pun on "Let-Us-Mine the social media data for you". With this tool, customers can track social media conversations about a keyword they are interested in and see all sorts of analyses of those conversations: how many people are talking about a particular topic at any given time, the sentiment towards a topic, and the traction velocity of a topic, to see in advance whether it will become a trend.
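The "traction velocity" idea can be sketched as the change in mention volume between consecutive time windows. The function names and windowing scheme below are illustrative assumptions, not LettuceMine's actual analytics code:

```python
from collections import Counter

# Count keyword mentions per fixed-size time window.
def mention_counts(posts, keyword, window_size):
    """posts: iterable of (timestamp, text); returns Counter window -> mentions."""
    counts = Counter()
    for timestamp, text in posts:
        if keyword.lower() in text.lower():
            counts[timestamp // window_size] += 1
    return counts

# Velocity = change in mention volume versus the previous window;
# a sustained positive velocity suggests the topic is trending.
def traction_velocity(counts, window):
    return counts.get(window, 0) - counts.get(window - 1, 0)
```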