
MOOC Viz Interns Week 5 Update

Lubo:

This week I found a new tool for building the dashboard – Shiny.

Shiny is a web framework for R that lets users easily create good-looking analytics dashboards directly in R, without any HTML or backend programming. At the same time, Shiny is built around HTML, so if you want to you can add HTML elements directly or apply a CSS theme. The framework also offers a good range of filtering controls and supports dynamic content. As far as visualisations go, Shiny works with some of the most popular JavaScript charting libraries, including Highcharts, D3, dygraphs and more.
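To give a sense of how little code a basic Shiny app needs, here is a minimal sketch (the inputs, plot and data are placeholders for illustration, not our actual dashboard code): a dropdown control drives a dynamically rendered plot.

```r
# Minimal Shiny sketch: a dropdown filter driving a dynamically rendered plot.
# Everything here is placeholder content, not the real course data.
library(shiny)

ui <- fluidPage(
  titlePanel("Course activity"),
  sidebarLayout(
    sidebarPanel(
      selectInput("week", "Course week:", choices = 1:6)
    ),
    mainPanel(
      plotOutput("activityPlot")
    )
  )
)

server <- function(input, output) {
  output$activityPlot <- renderPlot({
    # A real app would filter the course dataset by input$week;
    # here we just plot random data so the sketch runs on its own.
    hist(rnorm(100), main = paste("Activity in week", input$week))
  })
}

shinyApp(ui = ui, server = server)
```

Sourcing a file like this in RStudio (or calling `shiny::runApp()`) serves the dashboard locally in the browser, which is exactly the quick prototyping workflow we are after.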

Given all of these features, and the fact that Shiny can easily be understood and shared within the R community, we have decided to use it for the final dashboard.

Most of my work this week consisted of researching and learning Shiny. Apart from that, I received access to the course survey data and was given the task of producing different visualisations by filtering learners based on their survey responses. To accomplish this, I wrote two scripts: one that filters learners by their survey responses, and another that runs different analyses on the learner ids returned by the first script's functions.
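As a rough illustration of how the two scripts fit together (the column names and survey answers below are invented, not the real schema):

```r
# Sketch only: invented column names, not the real survey schema.
# Script 1: return the ids of learners who gave a particular survey answer.
filter_learners_by_response <- function(survey, question, answer) {
  survey$learner_id[survey[[question]] == answer]
}

# Script 2: run an analysis restricted to those learner ids,
# e.g. count recorded activity rows per filtered learner.
activity_per_learner <- function(activity, learner_ids) {
  filtered <- activity[activity$learner_id %in% learner_ids, ]
  table(filtered$learner_id)
}

# Example wiring of the two:
# ids <- filter_learners_by_response(survey, "employment_status", "Working full time")
# activity_per_learner(activity, ids)
```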

The next step is to use the scripts I have made to build an initial dashboard using Shiny.

Lin:

This week I completed two tasks: importing the MySQL data into a remote server and fetching course data from FutureLearn.

The first one wasn't very difficult, because I already have a script that imports the data into localhost; I just needed to change its arguments so that it imports into the remote server instead. However, the script didn't work at first because I had misunderstood how MySQL connections work: the remote MySQL server only accepts local connections, so to reach it I have to create an SSH tunnel and bind the remote database port to localhost. Fortunately, there is an external module, sshtunnel, which makes this binding easy, and so far it works without error.
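In case it is useful to others, here is a sketch of the tunnelling approach (the host names, credentials, ports and the pymysql client are placeholders and assumptions, not our actual configuration):

```python
# Sketch of the SSH-tunnel idea: forward the remote MySQL port to localhost,
# then connect to the forwarded port as if the database were local.
# Host names, credentials and the pymysql client are placeholders/assumptions.
from sshtunnel import SSHTunnelForwarder
import pymysql

with SSHTunnelForwarder(
    ("remote.example.org", 22),               # SSH server (placeholder)
    ssh_username="intern",
    ssh_password="secret",
    remote_bind_address=("127.0.0.1", 3306),  # MySQL as seen from the remote machine
) as tunnel:
    conn = pymysql.connect(
        host="127.0.0.1",
        port=tunnel.local_bind_port,          # the port sshtunnel bound locally
        user="mooc",
        password="secret",
        database="mooc_data",
    )
    # ... run the existing import script against this connection ...
    conn.close()
```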

The second task was harder for me because of my lack of experience. The goal is a script that automatically downloads the course data from FutureLearn and uploads it to the MOOC Observatory at regular intervals. To accomplish this I had to write HTTP requests in Python, and since I had never worked with HTTP before, it took me a few days to build up some basic knowledge. Currently, I am waiting for an admin account, because I need to analyse the admin webpage. I also need to decide on a suitable update interval, depending on the web server.
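The rough direction I am heading in looks something like the sketch below (the URLs and form fields are pure placeholders, since I cannot check the real admin pages until the account arrives); the script would then be run on a schedule, for example from cron.

```python
# Sketch of the planned download step using the requests library.
# The URLs and form field names are placeholders; the real ones depend on the
# FutureLearn admin pages, which I still need to inspect with an admin account.
import requests

session = requests.Session()

# Log in once; the session object keeps the authentication cookies
# for the requests that follow.
session.post(
    "https://www.futurelearn.com/sign-in",   # placeholder login URL
    data={"email": "admin@example.org", "password": "secret"},
)

# Download one of the data exports for a course run (placeholder URL).
response = session.get(
    "https://www.futurelearn.com/admin/courses/example-course/1/stats"
)
response.raise_for_status()

with open("example-course-stats.csv", "wb") as f:
    f.write(response.content)
```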

I think our current progress is good and I believe we will be able to finish the project on time. Hopefully nothing will go wrong in the near future, and I will keep trying my best on this project in the following weeks.
