
Visualising Data in NAPP Web Application

Catarina Alexandra Pereira [email protected]

Instituto Superior Tecnico, Lisboa, Portugal

October 2018

Abstract

Mentoring programmes within universities have been shown to improve students' academic performance. The Mentoring Programme at Técnico Lisboa's Taguspark campus has had a positive impact on students. Yet, it suffers from communication problems between the parties involved. A software solution named NAPP had been created to address these communication problems. NAPP's Web application is a component that is meant to be used as a high-level performance analysis tool by the mentoring programme's coordinator. The previous prototype of the Web application did not present the information necessary for the proper analysis of students' academic performance. A new Web application was created which uses data visualisations to describe students' academic performance. Further improvements were made to the other component of NAPP: the prototype of a mobile application to be used by mentors and mentees. The first beta test of the mobile application was conducted with a small group of students for 3 months, revealing that students used the application at least once a week, except for one week. A usability test on the new Web application showed high user satisfaction (average SUS score of 95). NAPP is ready to undergo a formal test which will make it possible to evaluate the system's value for the Mentoring Programme in Taguspark.

Keywords: Mentoring Programme, Student Success, Student Support Systems, Data Visualisation

1. Introduction

The Mentoring Programme is a Técnico Lisboa pioneer project first implemented in 1996. It focuses on the welcoming, integration, and assistance of students admitted for the first time to Técnico Lisboa, who are mainly first-year and international students. The new students receive personalised assistance from their peers – volunteer students who serve as mentors in the programme – during their initial contact with Técnico Lisboa and through the remainder of the first academic year. In the academic year 2011/2012 the Mentoring Programme went through a revamp in the Taguspark campus. The Mentoring Programme in Taguspark (MP-TP) uses a different strategy for accepting mentors: they are still volunteers, but they are interviewed beforehand in order to be accepted into the MP-TP. Another differentiating factor of the MP-TP is that the academic performance of first-year students is closely followed by a counsellor, who is the coordinator of the mentoring programme. Since the restructuring of the Mentoring Programme in Taguspark, the programme has had a positive impact on the academic success of first-year students at the campus. [1]

In order to achieve the intended goals of academic success and integration, the MP-TP depends on the exchange of information between three parties: the MP-TP's coordinator, mentors, and mentees. Despite the positive impact of the programme on first-year students, there are admitted communication problems between the three parties involved. The main communication problem concerns the reporting of grades by mentees. In order to follow mentees' academic performance, the MP-TP's coordinator produces and analyses a spreadsheet with the grades reported by the mentees. By analysing the negative grades reported, the coordinator determines if a mentee is at risk of academic failure.

For the coordinator to receive the grades, the reporting of grades flows in a sort of “boomerang” fashion. The MP-TP's coordinator has to message the mentors for their mentees' grades, who will then message their mentees asking for the grades. The mentees then have to report their grades to their respective mentors, and afterwards the mentors forward their mentees' grades back to the coordinator. This is where the communication problem lies, for two reasons: the reporting of grades is slow due to the “boomerang” flow, and the analysis of grades is also slow due to the delay associated with using a very basic spreadsheet. This compromises the ability to provide timely help to at-risk mentees.


In this process, the reporting of grades relies almost completely on the exchange of emails between the three parties. Since the academic year 2017/2018, mentors have been writing their mentees' grades directly on a collaborative spreadsheet in Google Sheets, which the coordinator uses to analyse the academic performance of mentees. This change allowed for a slight reduction of the delay, since the spreadsheet is now produced in a distributed manner, but the time associated with the “boomerang” flow did not decrease, nor did the time spent analysing the spreadsheet.

The NAPP framework is a software solution that was developed as the Master's degree thesis of Pedro Veiga, to try to tackle the problems of the Mentoring Programme in Taguspark. [2] Its two main components are a mobile application to be used by mentors and mentees, and a Web application to be used by the MP-TP's coordinator. A prototype of the mobile application was made and tested for usability with a small group of students. It supports the reporting of grades by mentees, and the monitoring of mentees' performance by mentors. It allows students to submit feedback about the MP-TP, and also includes study guidance tools to assist them in their academic life. With NAPP's original version, mentees report their grades using the mobile application, which makes them instantly available to the mentors and the MP-TP's coordinator, thus eliminating the “boomerang” flow.

A prototype of the Web application was also made, but it did not undergo any kind of testing. NAPP's Web application is intended to serve as a performance analysis tool for the MP-TP's coordinator. Yet, the first prototype did not satisfy the coordinator's needs for analysing each student's academic performance.

The first version of NAPP had other problems besides the prototype of the Web application. NAPP was not integrated with the evaluation methods of Técnico Lisboa's courses: the grades reported by mentees were associated only with a course, and not with a specific evaluation of the course (e.g. tests, projects, etc.). This association is critical to the correct analysis of the mentees' academic performance by the coordinator.

1.1. Key Contributions

This work presents the various enhancements made to NAPP's original version. The data model was redesigned for improved scalability; the mobile application was subject to significant modifications; and the inadequate first Web application was discarded and a new one was created according to the requirements of the MP-TP's coordinator. Moreover, three tests were conducted: a short-duration beta test of the mobile application with a small group of students; a long-duration beta test of the mobile application, also with a small number of students; and a usability test of the Web application.

1.2. Document Outline

This document is divided into five sections. Section 2 presents the related work, including two mentoring software solutions and the case of a successful use of academic data for the benefit of students. It also introduces the NAPP framework as it was originally developed. Section 3 explains all the work that was done to enhance NAPP and the various phases of evaluation. Section 4 presents the results and discussion. Lastly, Section 5 explains the conclusions of this work and outlines the future work.

2. Background

Organisations around the world are progressively choosing to use specialised mentoring software to enrich the experience of participants taking part in mentoring programmes.

These mentoring software solutions concentrate on the interactions between mentors and mentees, and, even when they offer variants that target mentoring programmes in higher education, they do not include any capabilities for directly analysing students' academic progress. Since this is one of the fundamental aspects of the MP-TP, there was a need to investigate another topic which relates more deeply to students' academic success: learning analytics. In recent years, higher education organisations have started to take advantage of the large amounts of data generated by their Learning Management Systems to provide insights on how to improve learning. [3] Georgia State University is a recognised successful case of the use of learning analytics to improve student success.

The two following subsections present two prominent software solutions for mentoring programmes, as well as the successful case of Georgia State University with learning analytics. A third subsection introduces the NAPP framework, explaining its original state.

2.1. Mentoring software

Chronus and MentorCore are two mentoring software solutions that can be applied to mentoring programmes within universities.

Both Chronus and MentorCore provide matching capabilities, which use mentor and mentee information to suggest suitable matches. Matching criteria are customisable, and can be defined by mentoring programme administrators. This technology is the best option when dealing with large numbers of mentors and mentees. However, it is not particularly useful in the context of the Mentoring Programme in Taguspark, because the number of participants is too small (approximately 250 mentees and 80 mentors). Thus, the matching has been randomised since the beginning of the programme, with the only constraint being that the mentor and the mentee must belong to the same academic degree. It is assumed that this randomisation has had no meaningful impact on the results of the programme.

Within Chronus, mentors and mentees can engage in the programme through a Web application and native mobile applications for smartphones and tablets. These applications offer not only direct communication tools between mentors and mentees, but also mentoring resources for new mentors, as well as a forum where participants can submit questions and answers. Unlike Chronus and NAPP, MentorCore does not provide a mobile application to programme participants. Instead, a Web application allows mentors and mentees to apply to the mentoring programme, find suitable matches, and engage in the programme.

Between them, these mentoring software solutions offer monitoring and management features to be used by mentoring programme administrators, including:

• A dashboard which contains real-time status reports regarding the mentoring programme,

• Workflow management, which enables mentoring programme administrators to define tasks and milestones for mentors and mentees,

• Simple visualisations of programme data, which make it possible to analyse programme outcomes and objectives,

• Customisable reports, which can be exported to Microsoft Excel,

• Surveys to be answered by mentors and mentees,

• Upload of resources and documents for mentors.

Like these two systems, NAPP is intended to be a technological solution that promotes communication between all the parties involved in a mentoring programme. However, what sets NAPP apart from Chronus and MentorCore is its special attention to students' academic success. Neither of these systems provides modules with an explicit focus on following the academic track record of participants – not even Chronus, which has been applied in academic contexts several times before.

2.2. Learning Analytics at Georgia State University

In the years leading up to 2012, academic advisement at Georgia State University (GSU) was showing signs of decay. The university supported six advising offices which had developed independently across multiple colleges, and thus had little coordination, no sharing of records, and no common training. This separation resulted in an inefficient and ineffective academic advisement system which often failed to provide assistance to GSU's large number of at-risk students. [4]

To try to solve this problem, a Web-based advising platform was built in collaboration with the Education Advisory Board (EAB). The platform consists of an early-warning system, named Graduation and Progression Success (GPS, or GPS Advising), which tracks GSU's 30,000 students on a daily basis, and applies ten years of historical data (2.5 million student grades) to create alerts and predictive analytics. [5] These alerts and analytics are delivered to adviser dashboards every morning, prompting student meetings with advisers, as well as providing guidance during those meetings. The platform keeps track of meetings with advisers and allows them to write notes on a permanent record available to all advisers in the university.

Georgia State University and EAB began developing the GPS Advising system in January 2012. The system went live in August 2012, and EAB provided common training to the university's advisers. [4] By 2017, the university-wide graduation rate had increased 22 percentage points, standing at 54%, and achievement gaps had been eliminated for at-risk students.

Georgia State University is a very successful case that highlights how data can be used to benefit students. The GPS Advising system shares some aspects with NAPP, such as the idea of delivering relevant alerts to the adviser's dashboard, or the visualisation of each student's progress in that same dashboard. Yet, the success of the GPS Advising initiative depends on the students' willingness to actively interact with their assigned adviser. When students enter the higher education system, they can often feel overwhelmed and lost. Peer Mentoring in higher education helps new students feel welcome and integrated, which often translates into their academic success. The core goal of NAPP is to join technology, Peer Mentoring, and academic data to ultimately contribute to students' academic success.

2.3. The NAPP framework

The NAPP framework was designed from the beginning to consist of three elements: the mobile application, the Web application, and a server which hosts the database that keeps all the data generated by the applications.

The mobile application was built using the Ionic Framework, which provides tools to develop cross-platform mobile applications using Web technologies like CSS, HTML5, and JavaScript. The mobile application is divided into two views: the mentee view and the mentor view. It allows mentees to report their grades – though this feature was not integrated with the evaluation methods of Técnico Lisboa's courses – and mentors to follow their mentees' academic evolution. It also includes study guidance tools for mentees, such as a list of tasks integrated with a Pomodoro timer. Mentors can also add tasks to their mentees' task lists through their view in the mobile application. The mobile application also has a feedback feature which mentees can use to submit their suggestions for improving the MP-TP.

The Web application uses a front-end dashboard template, ng2-admin, which provides tools to create charts and tables. The Web application consists of two pages. The first page displays a small amount of aggregated data, and the second page contains a table that displays the student information, with each row corresponding to a student.

The server hosts an Apache CouchDB database. CouchDB is database software that has a document-oriented NoSQL database architecture. The CouchDB database hosted on the server stores all the data related to the NAPP framework. Local data storage on the mobile application and on the Web application is done through PouchDB. PouchDB is open-source JavaScript database software which was created to facilitate the development of offline applications. It also has a document-oriented NoSQL database architecture. PouchDB stores data locally on the device, and then syncs those data to the server-side CouchDB database when an Internet connection is available. The system architecture, as described above, is presented in Figure 1.
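The offline-first storage pattern just described can be illustrated with a deliberately simplified in-memory sketch. This is not the PouchDB API: the class and method names below (`LocalStore`, `RemoteStore`, `goOnline`) are purely illustrative, and real PouchDB/CouchDB replication is far more involved (revisions, bidirectional sync, conflict detection).

```javascript
// Simplified illustration of the offline-first pattern NAPP relies on:
// writes always succeed against a local store, and are pushed to the
// remote store only when a connection is available.
class RemoteStore {
  constructor() { this.docs = {}; }
  put(doc) { this.docs[doc._id] = doc; }
}

class LocalStore {
  constructor(remote) {
    this.remote = remote;
    this.docs = {};
    this.pending = [];   // ids of writes made while offline
    this.online = false;
  }
  put(doc) {
    this.docs[doc._id] = doc;          // the local write always succeeds
    if (this.online) this.remote.put(doc);
    else this.pending.push(doc._id);   // queue the write for later sync
  }
  goOnline() {                         // connection restored: flush queue
    this.online = true;
    for (const id of this.pending) this.remote.put(this.docs[id]);
    this.pending = [];
  }
}

const remote = new RemoteStore();
const device = new LocalStore(remote);
device.put({ _id: 'grade-1', course: 'AL', value: 4 }); // offline write
device.goOnline(); // 'grade-1' now reaches the remote store
```

In NAPP itself this behaviour is provided by PouchDB syncing with the server-side CouchDB, rather than hand-written code like the above.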

3. Methodology

From the beginning, it was known that the evaluation of this work was highly dependent on whether there would be real data to analyse how valuable NAPP would be to the MP-TP. For this reason, the first objective was to get the mobile application to a state where it was ready to be used by mentors and mentees. Since the mobile application was initially developed as a prototype, it had only been tested for usability; and NAPP as a whole had only been run on a local computer environment.

Figure 1: The original architecture of the NAPP framework. The server hosts the NAPP CouchDB database, which syncs with the Web application (built on ng2-admin), used by the MP-TP's coordinator, and with the mobile application, used by mentees and mentors.

At that time, a considerable flaw of NAPP was related to the reporting of grades: the grades reported by students were not linked to an evaluation, which would make it very difficult for the MP-TP's coordinator to follow the students' progress accurately. The approach chosen was to model the courses' information, including the evaluation methods, and store it in a database, named the Courses database. NAPP's mobile application was then modified to retrieve courses' information from the Courses database, allowing students to choose both a course and one of its evaluations when reporting a grade.
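A document in the Courses database could plausibly be structured as below. This is a sketch under assumed field names (`evaluations`, `min`, `max`, `passMark`, and so on); NAPP's actual schema is not reproduced here.

```javascript
// Hypothetical shape of a Courses database document. With the
// evaluation methods embedded in the course document, a reported
// grade only needs to reference (courseId, evaluationId) to be
// unambiguous. All field names are illustrative.
const course = {
  _id: 'course-algebra-linear',
  name: 'Algebra Linear',
  semester: 1,
  evaluations: [
    { id: 'test-1', name: 'Test 1', min: 0, max: 5,  passMark: 2.5 },
    { id: 'exam',   name: 'Exam',   min: 0, max: 20, passMark: 9.5 }
  ]
};

// Look up the evaluation a reported grade refers to.
function findEvaluation(c, evaluationId) {
  return c.evaluations.find(e => e.id === evaluationId) || null;
}

findEvaluation(course, 'test-1'); // → the Test 1 evaluation object
```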

3.1. Structured test

With these changes, NAPP was ready for a first round of beta testing. The system was tested with a small group of students for three days, from the 1st to the 3rd of March, 2018. CouchDB was set up on a desktop computer, and the mobile application was built for Android and installed on the students' mobile devices. The students were given a small guide which had a set of chores to complete on given days, along with the expected results of each chore. The goal was to make sure that the system behaved as expected. Because of the very methodical nature of this test, it was named the “structured test”.

During the structured test some critical problems were found that needed to be fixed. The most severe of these problems had to do with NAPP's data model. The test guide was carefully planned to test the offline behaviour of the mobile application, and the data model proved to be unfit to handle the mobile application's requirements. At that point, all the data generated by users were stored in a single database, named “napp data”. This approach of centralising all the user-related data in a single collection of documents (one for each user) does not scale well: firstly, it makes queries to the database unnecessarily complex and slow; secondly, it is very prone to creating replication conflicts. A replication conflict happens when CouchDB is trying to sync a remote database with the local databases on the users' devices, but the data being synced are incompatible, so one of the versions is discarded. This often happens due to concurrent offline writes, which were induced by some of the guide's chores. Since NAPP did not implement any conflict resolution mechanisms, the data in the conflicting versions were lost.

These problems prompted an extensive redesign of the data model. The problematic approach of storing all the user data in a single database document was abandoned, and instead, user-related data are now stored in three different databases. The Users database has a document for each user which stores the user's personal information and, if the user is a mentee, their grades. The Tasks database stores a new document for each task that is created (previously, the tasks of a mentee were stored in the user document, which would cause replication conflicts when the mentee created a task and the mentor assigned a task to the mentee while one of them was offline). The Feedback database stores all the feedback submitted by the users, which facilitates accessing and querying all the suggestions (as opposed to the old data model, where the feedback was stored in the user document). An additional database, named the Log database, keeps a record of relevant actions performed by users, which will make it possible to analyse the impact of NAPP on the MP-TP in the future.
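Why the one-document-per-task design avoids these conflicts can be shown with a small sketch. This is illustrative only: real CouchDB replication tracks revision trees and flags conflicts explicitly, rather than the naive last-write-wins shown for the old model.

```javascript
// Old model: mentee and mentor both edit ONE shared user document
// while offline. Without conflict resolution, a sync must pick one
// version and discard the rest (naive last-write-wins shown here).
function syncSingleDoc(serverDoc, offlineEdits) {
  return offlineEdits[offlineEdits.length - 1]; // earlier edits are lost
}

// New model: each task is its OWN document with a distinct id, so two
// concurrent offline writes create two documents and nothing clashes.
function syncTaskDocs(serverDocs, offlineDocs) {
  const byId = { ...serverDocs };
  for (const d of offlineDocs) byId[d._id] = d; // distinct ids: no clash
  return byId;
}

const menteeTask = { _id: 'task-1', title: 'Revise chapter 3' };
const mentorTask = { _id: 'task-2', title: 'Solve exercise sheet' };
const merged = syncTaskDocs({}, [menteeTask, mentorTask]);
// merged keeps both tasks; the old single-document model would not
```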

Other problems were found during the structured test. For example, one problem had to do with the interaction flow of the authentication process in the mobile application. The original authentication process confused mentees who took part in the test – some of them ended up accidentally signing up for mentor accounts because the process was not clear. The interaction flow was then modified, which included redesigning the authentication screens of the mobile application. Additionally, the concept of authorisation was introduced into the authentication process. Previously, anyone could sign up to use NAPP's mobile application, making it vulnerable to malicious exploits. An authorisation stage was added to the authentication process, which ensures that only mentors and mentees who were previously enrolled can sign up to use the mobile application.

3.2. Unstructured test

With these problems fixed, the next objective was to test the mobile application in a real environment for nearly an entire semester, to be able to collect the data needed to evaluate the value of NAPP to the MP-TP. The ideal course of action was to test the mobile application on both Android and iOS devices, to collect data from as many users as possible. At that point, the acquisition of the licences for publishing the application on “app stores” had not yet been authorised by Técnico Lisboa's administrative services, so it was necessary to turn to an alternative course of action: distribute the application to the users who would participate in the test outside of official channels.

Regarding iOS devices, the only way to install an unpublished application on the devices is to start an official beta test, which still requires a licence. On Android, on the other hand, one only needs to build the mobile application into an Android package, which can be installed on any Android device, without the need for a licence for distribution through Google Play. Therefore, while the licences were not acquired, it was only possible to test NAPP's mobile application on Android devices.

Two short presentations were given to try to gather students to participate in the test. During these presentations, most of the mentees who showed interest happened to have iOS devices. In the end, 5 mentors and 11 mentees participated in the test. During three months of the second semester of the academic year 2017/2018, the mobile application was used by these students. The students were shown the features of the mobile application but were not given any guide to follow; they were free to use the mobile application as they pleased. Because of the more informal nature of this test, it was named the “unstructured test”.

3.3. The new Web application

While the tests took place, another core part of this work was being developed. NAPP's Web application is a fundamental component of NAPP, because it is the element that aims to help the MP-TP's coordinator analyse the students' grades more efficiently. The problem was that the existing Web application did not suit the needs of the coordinator. In the original Web application, all the user information was displayed in a table, but this table did not resemble the spreadsheet already used by the coordinator and could not be manipulated like that spreadsheet. The previous Web application was developed using a front-end dashboard template, named ng2-admin, and this template was sufficient for the very few features of the application. However, due to its high complexity and poor documentation, the template was not suitable for the continued improvement that NAPP requires. For these reasons, the existing Web application was discarded and a new one was built according to the coordinator's requirements.

The new Web application was developed with Angular, without the use of any template. The Web application employs charts to facilitate the visualisation of information, and the Chart.js library is used to draw them. Chart.js is a charting library for JavaScript that has support for various types of charts and interactivity. The Web application is divided into five features.

3.3.1. The Courses feature

The Courses feature gives the MP-TP's coordinator the ability to manage courses' information, and to visualise and analyse the results of each course's evaluations. The Courses feature comprises three pages: the list of courses, the course page, and the page to add a new course.

The list of courses displays a list of cards, each representing a course. Each card has a hyperlink to the course's page, and also shows basic information about the course.

The course page is the most important page in the Courses feature. The main goal of this page is to facilitate the analysis of students' results in each evaluation of the course. The page provides a way to quickly identify the students who had negative grades in an evaluation, and automatically calculates basic statistics for each evaluation (average grade, number and percentage of negatives and positives). The distribution of students' grades is plotted in a bar chart which adapts to the grade interval of each evaluation (some evaluations do not fit the usual 0-20 scale). Figure 2 shows the distribution of grades for the first test of the “Algebra Linear” course, whose grade interval ranges from 0 to 5.

Figure 2: The course page of the Web application
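The adaptive bar chart can be backed by a binning function along these lines. This is a minimal sketch assuming integer-wide bins and rounding to the nearest bin; the actual Chart.js set-up in NAPP may differ.

```javascript
// Sketch of binning grades for the distribution chart. The bins adapt
// to the evaluation's grade interval (e.g. 0-5 instead of the usual
// 0-20). Students are kept per bar so that clicking a bar can later
// filter the table of students below the chart.
function gradeDistribution(grades, min, max) {
  const bins = {};
  for (let g = min; g <= max; g++) bins[g] = [];
  for (const { student, grade } of grades) {
    const bin = Math.min(max, Math.max(min, Math.round(grade)));
    bins[bin].push(student);
  }
  return bins;
}

const dist = gradeDistribution(
  [{ student: 'A', grade: 1.8 },
   { student: 'B', grade: 4.6 },
   { student: 'C', grade: 2.1 }],
  0, 5); // a 0-5 interval, as in the "Algebra Linear" first test
// dist[2] → ['A', 'C'], dist[5] → ['B']
```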

There is a table below the distribution of grades which shows the list of students and their grades. When a bar in the chart is clicked, the table is filtered to only show the students who obtained the grade corresponding to the selected bar. On the right-hand side of the page, a circular chart shows how the students are partitioned in terms of the number of positive, negative, and not reported grades. Additionally, the same statistics are presented in text format below the chart, along with the average grade for the evaluation.
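The per-evaluation statistics described above amount to a small aggregation over the reported grades. A sketch follows; the pass mark parameter is an assumption (a 0-20 evaluation at Técnico typically passes at 9.5, but NAPP's exact rule is not reproduced here).

```javascript
// Sketch of the per-evaluation statistics shown on the course page:
// average grade, counts of positives/negatives, and how many enrolled
// students have not reported a grade at all.
function evaluationStats(grades, totalStudents, passMark) {
  const positives = grades.filter(g => g >= passMark).length;
  const negatives = grades.length - positives;
  const notReported = totalStudents - grades.length;
  const average = grades.length
    ? grades.reduce((sum, g) => sum + g, 0) / grades.length
    : null; // no grades reported yet
  return {
    average, positives, negatives, notReported,
    pctNegative: grades.length ? 100 * negatives / grades.length : 0
  };
}

const stats = evaluationStats([12, 8, 15, 9], 5, 9.5);
// → { average: 11, positives: 2, negatives: 2, notReported: 1, pctNegative: 50 }
```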

The last page of the Courses feature is the page to create a new course by filling in the necessary information. The page contains a form that is divided into two parts: first the user inserts the basic information about the course, then the user adds the evaluation method. There are field validation rules in place, and error messages are shown to the user if any field does not follow the rules.
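The kind of validation the form applies can be sketched as below. The specific rules shown (non-empty evaluation name, a grade interval with min strictly below max) are plausible examples, not necessarily the exact rules NAPP implements.

```javascript
// Sketch of form field validation for an evaluation entry on the
// new-course page. Returns a list of error messages; an empty array
// means the field group passes validation.
function validateEvaluationField(evaluation) {
  const errors = [];
  if (!evaluation.name || !evaluation.name.trim())
    errors.push('The evaluation name is required.');
  if (typeof evaluation.min !== 'number' ||
      typeof evaluation.max !== 'number' ||
      evaluation.min >= evaluation.max)
    errors.push('The grade interval must satisfy min < max.');
  return errors;
}

validateEvaluationField({ name: 'Test 1', min: 0, max: 5 }); // → []
validateEvaluationField({ name: '', min: 5, max: 5 });
// → two error messages, to be shown next to the offending fields
```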

3.3.2. The Students feature

The Students feature allows the MP-TP's coordinator to view and manage information related to the students who take part in the mentoring programme. Most importantly, the coordinator is able to follow each student's academic progress. The Students feature comprises four pages: the students list, the student page, the page to enrol mentors, and the page to view unregistered students (students enrolled in the programme who have not yet signed up on the mobile application).

The students list displays the registered students who take part in the MP-TP. The students' pages can be accessed through this list or by means of a search bar at the top of the page.

The goal of the student page is to allow the MP-TP's coordinator to follow a specific student's progress. In the case of mentees, this refers to the academic progress; in the case of mentors, it concerns the progress within the mentoring programme. The student page has two possible views, depending on the role of the student: the mentee view and the mentor view.

The mentor view has very little functionality at the moment. It only shows the mentor's personal information and the list of mentees (both registered and unregistered).

The mentee view shows the mentee's personal information, grades, and use of the study tools on the mobile application, and allows the coordinator to switch the mentee's mentor in the current academic year. For each semester, the student's grades can be visualised in a line chart (Figure 3). The chart allows for easier visualisation of the evolution of the student's grades over time. The horizontal axis represents dates within the selected semester. The vertical axis represents the grades obtained by the student, normalised to the 0-100% range (normalisation was needed because grade intervals vary among the evaluations). Below the chart, the page also displays tables, where it is possible to view the student's grades in every evaluation of each course.
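The normalisation described above is a simple linear mapping from an evaluation's grade interval onto the chart's percentage axis. A minimal sketch (the function name and signature are illustrative, not taken from the paper):

```javascript
// Hypothetical sketch: map a grade from its evaluation's interval
// [min, max] onto the 0-100% range used by the line chart's vertical axis.
function normaliseGrade(grade, min, max) {
  if (max === min) throw new RangeError('empty grade interval');
  return ((grade - min) / (max - min)) * 100;
}
```

For example, a grade of 4 on a 0-5 test and a grade of 16 on a 0-20 exam both map to 80%, which is what makes evaluations with different scales comparable on the same chart.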

Figure 3: The tab that shows the mentee's grades on the student page

Another page in the Students feature allows the coordinator to enrol new mentors. The enrolment of new mentors relates to the new authorisation mechanisms introduced before the unstructured test: mentors must be enrolled beforehand in order to be able to sign up using the mobile application. To enrol new mentors, the coordinator must insert an academic year and the IST IDs of the mentors. For convenience, instead of having to type out the IST IDs of the students, the coordinator can submit an Excel file. There are instructions on screen on how to correctly create the Excel file.
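Once the IDs have been read out of the uploaded file, the validation step could look like the following sketch. This is hypothetical: the paper does not state the format rules, so the `ist1` prefix followed by digits is an assumption.

```javascript
// Hypothetical sketch: keep only well-formed IST IDs from a list of raw
// cell values extracted from the uploaded spreadsheet. The "ist1" + digits
// format is an assumption, not taken from the paper.
function filterValidIstIds(cells) {
  const pattern = /^ist1\d+$/;
  return cells
    .map((cell) => String(cell).trim().toLowerCase())
    .filter((id) => pattern.test(id));
}
```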

3.3.3. The Alerts feature

The objective of the Alerts feature is to deliver alerts about relevant events within NAPP to the Web application. At this point, the only event that was identified as being valuable is when a student reports a negative grade. Thus, the Alerts feature serves the purpose of helping the MP-TP's coordinator to identify at-risk students. The Alerts feature was designed with extensibility in mind, so that, in the future, it can support different types of alerts, and not only alerts relating to the report of negative grades. The goal is to give the coordinator the ability to customise, within a set of predefined parameters, what types of alerts they would like to receive. For this purpose, it was necessary to create and model an entity that can describe the preferences of the coordinator. The coordinator's preferences are kept in a document in the Users database – the coordinator's document. The Alerts feature uses Firebase Cloud Messaging (FCM) to deliver the alert as a push notification, which appears on the coordinator's device even if they are not using the Web application at that time. The coordinator has to allow the Web application to show push notifications to activate this functionality, and they can opt out at any time – the alerts will still be delivered inside the Web application even if the coordinator has opted out of push notifications.

The Alerts feature is a program that runs continually on the server. The program syncs with the Log database and listens for changes, so that every time a new document is written to the database, it triggers the execution of a procedure associated with the Alerts feature. This procedure determines whether the event associated with the new log document should cause an alert to be delivered to the MP-TP's coordinator, and it also handles the delivery of the alert (both to the Web application and through FCM). In the Web application, the list of alerts is shown in a panel that can be opened by clicking a “bell” button in the top-right corner of the screen, which will show the panel overlaying the current page.
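The worker's decision step could be sketched roughly as follows. This is a hypothetical illustration: the log document schema (`type`, `grade`, `maxGrade`) and the shape of the coordinator's preferences document are assumptions, since the paper does not show them.

```javascript
// Hypothetical sketch of the Alerts worker's decision step. The fields
// type/grade/maxGrade and the preferences shape are assumptions.
function shouldAlert(logDoc, preferences) {
  if (!preferences.negativeGradeAlerts) return false; // coordinator opted out
  if (logDoc.type !== 'grade-report') return false;   // only grade reports matter for now
  return logDoc.grade < logDoc.maxGrade / 2;          // negative grade triggers the alert
}
```

In the real system, a predicate like this would run inside the worker's database-changes listener, and a positive result would trigger delivery of the alert both to the Web application and through FCM.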

3.3.4. The Dashboard feature

The goal of the Dashboard is to allow the coordinator to view relevant, up-to-date information about the MP-TP without the need to browse through the other pages of the Web application. This information is conveniently aggregated and presented in the form of “cards” on the Dashboard page (Figure 4).

Figure 4: The Dashboard

The information presented on the Dashboard page includes:

• A table with statistics about evaluations that took place within the last month. For every recent evaluation, the card shows how many grades have been reported, how many of those are negative grades, and the average grade;

• A table with the dates of evaluations that will happen within the next two weeks;

• A list of at-risk students, featuring those who have reported at least one negative grade;


• The number of enrolled versus unregistered mentors and mentees. This is particularly relevant at the start of the academic year, so that the coordinator can keep track of how many students have not yet created an account in the mobile application;

• A visualisation of the activity logs generated by the students using the mobile application, in the past week and month.

As this is a dashboard, it is important that the information is up to date, so the page is configured to refresh every five minutes.

3.3.5. The Feedback feature

The Feedback feature is made up of a single page where the MP-TP's coordinator can view the suggestions submitted by mentees. The suggestions are sorted by date, showing newest suggestions first. Above the list there is a search box that can be used to search for words within the text of the suggestions.

3.3.6. Usability testing

In order to be able to detect flaws in its design, the Web application needed to undergo usability testing. In a usability test, the users are asked to perform a set of tasks using the system. These tasks should represent a realistic scenario of usage. In this test, the Courses and Students features were tested individually, but the Dashboard and Feedback features were tested as a group. The decision to group these two features was based solely on the small number of tasks for them. The Alerts feature was not tested because of the lack of meaningful tasks concerning this feature.

It is important to note that the purpose of these tests was to evaluate the Web application's features from a qualitative viewpoint: the goal was to gain insight into common errors, to be able to correct them. The purpose of this test was not to do a quantitative usability study by analysing metrics such as user efficiency, number of user errors or subjective satisfaction. As advocated by Nielsen, when running a usability test that is not meant for a quantitative usability study, testing five users almost always reaches the maximum benefit-cost ratio of user testing. [6, 7] For these reasons, the Courses feature was tested with seven users and the remaining features were tested with five users.

To gain a general idea of how the users felt about the system, they answered the System Usability Scale (SUS) survey after testing each feature. The SUS is a means of assessing the usability of a system by means of a survey answered by the users taking part in the test. The survey consists of 10 items which the users must answer on a scale of 1 to 5 (1 corresponding to “strongly disagree”, and 5 to “strongly agree”). The answers are then converted into a score which ranges between 0 and 100. [8] The users answered a version of the SUS survey translated into European Portuguese. [9]
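The conversion from raw answers to a 0-100 score follows Brooke's scoring rule: each odd-numbered item contributes (answer − 1), each even-numbered item contributes (5 − answer), and the sum is multiplied by 2.5. As a sketch:

```javascript
// Compute a SUS score (0-100) from the 10 answers, each on a 1-5 scale,
// following Brooke's scoring rule: odd items score (answer - 1), even
// items score (5 - answer), and the total is multiplied by 2.5.
function susScore(answers) {
  if (answers.length !== 10) throw new RangeError('SUS has exactly 10 items');
  const sum = answers.reduce(
    (acc, answer, i) => acc + (i % 2 === 0 ? answer - 1 : 5 - answer),
    0
  );
  return sum * 2.5;
}
```

A respondent answering 5 to every odd item and 1 to every even item yields the maximum score of 100; answering 3 throughout yields 50.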

3.4. The REST API

Besides the two main components, a REST API was also created, mainly to fulfil some requests that would put a heavy load on the client machine if they were computed by the Web application. After the authentication process of NAPP was migrated to use FenixEdu (Section 3.5), another category of endpoints was added. The endpoints provided by the REST API can be divided into four categories:

• Authentication: endpoints relating to the FenixEdu authentication process

• Academic: endpoints to retrieve information related to school, such as evaluations or student grades

• Analytics: endpoints to retrieve statistical information about the usage of NAPP

• Mentoring: endpoints to retrieve or update information about the MP-TP

3.5. FenixEdu Authentication

After the end of the structured and unstructured tests, another important improvement was made to the mobile application. The authentication process became integrated with the Central Authentication Service provided by FenixEdu, the academic information platform created by and used at Técnico Lisboa. This allows students to use NAPP with their academic credentials. Not only is this more convenient for students, who will not have to memorise a new set of credentials to use NAPP, but it also allows NAPP to be further integrated with the FenixEdu API in the future. After the student is authenticated via FenixEdu in the mobile application, the authorisation phase begins. A request is made to the REST API, which verifies whether the authenticated user corresponds to a new mentor in the current academic year (who would have been previously enrolled by the MP-TP's coordinator on the Web application), or to a new mentee (who would have been added by the student's mentor using the mobile application).

3.6. System architecture and deployment

In the current system architecture (Figure 5), the server no longer runs only CouchDB. The REST API also runs on the server, and it answers requests from the Web and mobile applications. The REST API was developed in Node.js, and it uses PouchDB to communicate with CouchDB for database access and sync.


The Alerts feature of the Web application requires that a worker runs on the server. This worker was developed in Node.js and it also uses PouchDB to connect to CouchDB for database access and sync. The Alerts worker communicates with Firebase Cloud Messaging for delivering push notifications to the coordinator's device. Firebase Cloud Messaging is a cross-platform cloud solution for messaging provided by Firebase, a subsidiary of Google.

The Web application has database access through PouchDB – it keeps local replicas of most databases, which sync with the remote server databases. As for the mobile application, each client device running the application will have its local replicas of the databases, and PouchDB will take care of keeping them in sync with the remote server databases.

Figure 5: The current architecture of NAPP (the Web and mobile applications sync with the NAPP databases in CouchDB on the server, which also runs the REST API and the Alerts worker; push notifications reach the MP-TP's coordinator through FCM)

The Server pictured in the system architecture will actually be deployed as two separate servers. These servers are hosted by Técnico Lisboa's virtualisation platform, powered by OpenStack software. The first virtual server runs CouchDB exclusively, thus becoming the dedicated database server. The second virtual server runs the REST API, the Alerts worker, and also the Apache Web Server software which serves the Web application. Each virtual server has a public IP address, making it accessible from outside of Técnico Lisboa's campi. Both servers have been configured with HTTPS for secure communications, with SSL certificates provided by Técnico Lisboa.

As of October 2018, the licences to publish NAPP's mobile application on Google Play (Android) and the App Store (iOS) have been acquired.

Feature                   Average SUS Score
Courses                                  95
Students                                 93
Feedback and Dashboard                   97
Average of all tests                     95

Table 1: Average SUS scores obtained by the Web application

4. Results and discussion

As mentioned before, the Web application underwent a usability test made up of three parts, where each part tested a different feature (the last part tested the Feedback and Dashboard features as a whole). An acceptable average score on the System Usability Scale ranges between 65 and 70. [10] Table 1 shows the average scores obtained by the features that were tested, which fall above the range of acceptable average scores. While there is no intention to use these scores as a means to quantitatively evaluate the Web application and to make a definite statement about its usability, one can say that the average SUS score shows high user satisfaction.

In general, users were fairly successful in completing the tasks that were presented. However, there was one particular task in the Feedback and Dashboard features that all five users failed – the task consisted of looking up information in the dashboard and answering a question; users were not informed during the test if their answer was correct. As such, it is interesting to note that the Feedback and Dashboard features have the highest SUS score among the three tests. This could be explained by the fact that the SUS survey is answered immediately after the users finish the test, before any feedback is given about the results.

As for the mobile application, 5 mentors and 11 mentees participated in the unstructured test. Because this was a very small number of students, it was not possible to do any meaningful statistical analysis about the use of NAPP's mobile application. Given these circumstances, it was decided that the objective of this test was no longer to collect data to evaluate the value of NAPP to the MP-TP, but rather to get a first impression of how much the students would use the mobile application over an extended period of time.

Over the course of nearly three months (between April 10th and June 26th, 2018), the students generated 132 activity logs. Figure 6 shows the distribution of activity logs over time. It is possible to see that there was higher activity during the first week and that it later slowed down. During the remaining time, students mostly reported grades and used the task list tool. As can be seen in the distribution, although there were peaks in activity, students used the mobile application at least once a week (except for the week between May 29th and June 5th). In a feedback session after the test ended, students reported that what they struggled with the most was knowing when the mentor interacted with them, or when the mentee reported a grade or completed a task – this difficulty in remaining engaged with the mobile application explains the peaks in activity seen in the distribution.

Figure 6: The activity logs generated during the unstructured test

5. Conclusions and Future Work

The unstructured beta test conducted with a small group of students over a long period revealed that students engaged with the mobile application at least once a week, except for one week. The usability test carried out on the new Web application made it possible to diagnose some usability problems, but users were generally very satisfied with the application.

These tests brought NAPP to a state where it is ready to undergo a bigger and more formal test, in a real environment. The goal of this test will be to evaluate whether NAPP contributes positively to the MP-TP by evaluating a key metric: how many days pass between a professor publishing an evaluation's grades and the coordinator receiving them. For this test, the students will be divided into two groups. An experimental group will use NAPP's mobile application and, to be able to make a comparison with the traditional method of the MP-TP, a control group of roughly the same size will not use the mobile application. This test will include approximately 150 mentees and their mentors. It will take place from late October 2018 until January 2019 (the end of the first semester).
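Per evaluation, the key metric reduces to a simple day difference. A minimal sketch, assuming dates are available as ISO 8601 strings (the actual date representation is not specified in the paper):

```javascript
// Hypothetical sketch: days elapsed between a professor publishing an
// evaluation's grades and the coordinator receiving them, given ISO 8601
// date strings (e.g. "2018-11-05").
function daysUntilReceived(publishedDate, receivedDate) {
  const MS_PER_DAY = 24 * 60 * 60 * 1000;
  return Math.round(
    (Date.parse(receivedDate) - Date.parse(publishedDate)) / MS_PER_DAY
  );
}
```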

Moreover, further enhancements should be made to NAPP, including the implementation of push notifications in the mobile application to increase user engagement, as well as the implementation of a scoring system for mentors' activities, to be used by the MP-TP's coordinator.

References

[1] F. David, A. Torres, and C. Patrocínio, “Desempenho académico no campus do Taguspark: impacto do projeto “Sucesso Escolar (NAPE-TP)” nos resultados do 1.º ano,” 2014. [Online]. Available: http://nep.tecnico.ulisboa.pt/download/Salas de estudo 201213 v24 02 14.pdf

[2] P. Veiga, “NAPP: Connecting mentors and students at Técnico Lisboa,” Master's thesis, Instituto Superior Técnico, Universidade de Lisboa, 2017.

[3] A. Dutt, M. A. Ismail, and T. Herawan, “A Systematic Review on Educational Data Mining,” IEEE Access, vol. 5, pp. 15991–16005, 2017.

[4] “GPS Advising at Georgia State University,”2014. [Online]. Available: http://oie.gsu.edu/files/2014/04/Advisement-GPS.pdf

[5] Georgia State University, “GPS Advising.”[Online]. Available: http://success.gsu.edu/initiatives/gps-advising/

[6] J. Nielsen and T. K. Landauer, “A mathematical model of the finding of usability problems,” in Proceedings of the INTERACT ’93 and CHI ’93 Conference on Human Factors in Computing Systems, ser. CHI ’93. New York, NY, USA: ACM, 1993, pp. 206–213. [Online]. Available: http://doi.acm.org/10.1145/169059.169166

[7] J. Nielsen, “How Many Test Users in a Usability Study?” June 2012, accessed 3 October 2018. [Online]. Available: https://www.nngroup.com/articles/how-many-test-users

[8] J. Brooke, “SUS-A quick and dirty usabilityscale,” Usability evaluation in industry, vol.189, no. 194, pp. 4–7, 1996.

[9] A. I. Martins, A. F. Rosa, A. Queirós, A. Silva, and N. P. Rocha, “European Portuguese Validation of the System Usability Scale (SUS),” Procedia Computer Science, vol. 67, pp. 293–300, 2015, Proceedings of the 6th International Conference on Software Development and Technologies for Enhancing Accessibility and Fighting Info-exclusion. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S1877050915031191

[10] A. Bangor, P. T. Kortum, and J. T. Miller, “An Empirical Evaluation of the System Usability Scale,” International Journal of Human–Computer Interaction, vol. 24, no. 6, pp. 574–594, 2008.
