
feat: track user events in Countly #282


Merged · 23 commits into code from feat/countly-event-tracking · Aug 20, 2019

Conversation

terichadbourne
Member

terichadbourne commented Aug 6, 2019

closes #100

Remove before merging:

  • test lessons (commit 47359c9)
  • commenting that makes clicks from localhost get tracked (commit d3ffda0) - overwritten by later changes
  • more test lessons (commits 42bf9bc & bb703e0)

WIP
Previously in place:
Heat map (track_clicks):
[screenshot]

Added so far:
Link tracking by href (track_links, event linkClick, segment href):
[screenshot]

Link tracking by linked text (track_links, event linkClick, segment text):
[screenshot]

Scroll map for each page, color-coded by the percentage of visitors who see each portion of the screen (track_scrolls):
[screenshot]
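
For reference, this is roughly how these trackers are enabled with the Countly Web SDK, using its standard async queue; a minimal sketch, where the app key is a placeholder:

```js
// Minimal Countly Web SDK setup (app_key is a placeholder)
var Countly = Countly || {}
Countly.q = Countly.q || []
Countly.app_key = 'YOUR_APP_KEY'
Countly.url = 'https://countly.proto.school'
Countly.q.push(['track_sessions'])
Countly.q.push(['track_pageview'])
Countly.q.push(['track_clicks'])  // heat map
Countly.q.push(['track_links'])   // linkClick events, segmented by href and text
Countly.q.push(['track_scrolls']) // scroll map
```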

terichadbourne added the docs-ipfs (In scope for IPFS Docs Working Group) label on Aug 6, 2019
@terichadbourne
Member Author

Segmentation of link clicks by view (the page you were on when you clicked the link) isn't working yet. All clicks are registering as coming from the homepage (/), even though Countly can distinguish between page views elsewhere.

[screenshot]

@terichadbourne
Member Author

@fsdiogo and I just implemented tracking for when a lesson is passed, which is detected slightly differently for each lesson type:

  • exercise (both with and without file upload): when the validation code shows a successful output
  • text-only: when you click "next" at the end of a lesson, which implies you've finished reading it
  • multiple-choice: when you select the right radio option

For each of these we exclude any automatic tests run on page load, so as not to double-count successes triggered by us. However, if a user manages to un-pass and then re-pass a lesson (by switching from a right answer to a wrong one and back again in multiple choice, or by resetting and resubmitting code in an exercise), it will be counted as passed again.

The event lessonPassed currently has 3 segments:

  • path (combo of tutorial and lesson number, very useful)
  • tutorial (the tutorial shortname, less useful because a tutorial with more lessons will get higher counts)
  • lessonNumber (pretty much useless on its own, since it combines results for a given lesson number across multiple tutorials; if we find a way to combine segments in the dashboard so we can view by tutorial and then by lesson number, it would be awesome)
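
Under the hood, recording such an event with the Countly Web SDK looks roughly like this; a minimal sketch, with illustrative segment values:

```js
// Record a lessonPassed event with its three segments (values illustrative)
Countly.q.push(['add_event', {
  key: 'lessonPassed',
  count: 1,
  segmentation: {
    path: '/basics/02',  // tutorial shortname + lesson number
    tutorial: 'basics',
    lessonNumber: '02'
  }
}])
```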

[screenshot]

@terichadbourne
Member Author

@fsdiogo I realized I had left the code stuck on counting in increments of 20 so I fixed that. :)

I also added sample lessons of every type to my add-test-pages branch and merged it in, so that we can do all of our click-around analytics testing within the test lessons, where we won't affect the real analytics. I changed the resources there to dummy ones so we can also use them to test linkClick without messing anything up.

I've pushed add-test-pages to GitHub so that we can share it and merge it into other PRs when that's helpful for testing. The test tutorial lives at http://localhost:3000/#/tests and is not surfaced on the homepage or the tutorials page, just in case it accidentally gets merged someday.

@fsdiogo
Collaborator

fsdiogo commented Aug 9, 2019

Just added metrics for when users reset or submit their code (the automatic runs that fire when you enter a previously passed lesson are discarded).
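
A rough sketch of how that discard might look; the function, flag, and editor/lesson objects here are hypothetical, though resetCode is one of the event names we use:

```js
// Hypothetical sketch: record a reset only for user-initiated resets, never
// for the automatic run that fires when you enter a previously passed lesson.
function resetEditor ({ userInitiated = true } = {}) {
  editor.setValue(lesson.defaultCode) // hypothetical editor & lesson objects
  if (userInitiated) {
    Countly.q.push(['add_event', { key: 'resetCode', count: 1 }])
  }
}
```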

@ericronne
Collaborator

This rocks!

@terichadbourne
Member Author

We've recently added event tracking for:

  • submitWrongCode (when the user hits submit and their code doesn't pass validation)
  • resetCode (when the user chooses to reset the contents of the code editor and start over)
  • submitWrongChoice (when the user selects an incorrect multiple-choice answer; we note which wrong answer was picked, and we'll need to figure out how to join the path segmentation with the choice segmentation to make this useful)
  • tutorialPassed (when the passing of a lesson causes all lessons in the tutorial to be passed)

Automatic evaluations of answers on page load are excluded from these numbers.

Some of these will require further work in the dashboard (hopefully) or manually (hopefully not) to make the data most useful. For example:

  • We could look at the submitWrongCode, resetCode, and lessonPassed numbers, all segmented by the same lesson path, to see how many wrong answers happen for every right answer in a particular lesson.
  • We'll also need to figure out how to apply the wrongChoice segment on top of the lesson path segment so that the data on which wrong multiple-choice answer was selected becomes useful, which will help us understand common misconceptions (see the sketch below).

For now we're focusing on capturing the underlying data we'll need to support such analysis.
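
Capturing both segments on a single event looks roughly like this; a minimal sketch, with illustrative values:

```js
// Record a submitWrongChoice event carrying both the lesson path and the
// specific wrong answer, so the dashboard can later cross-filter the two.
Countly.q.push(['add_event', {
  key: 'submitWrongChoice',
  count: 1,
  segmentation: {
    path: '/basics/03',                 // which lesson the user was on
    wrongChoice: 'An illustrative wrong answer'
  }
}])
```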

tutorialPassed, segmented by tutorial:
[screenshot]

We've also created a Vue data property, isTutorialPassed, that can be used to update the UI, as with the trophy emoji below, granted for completing all lessons in the tutorial (or however we decide to show progress through completed tutorials in the future):
[screenshot]
This badge of honor can be earned without viewing the resources page, but we're now better prepared to address the issue of allowing celebratory tweets or saving to gists from the resources page when a lesson is completed: #243
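
A minimal sketch of how isTutorialPassed might be derived and used, assuming hypothetical component state (how we actually store passed lessons may differ):

```js
// Hypothetical sketch of a Vue computed property driving the trophy UI
export default {
  computed: {
    isTutorialPassed () {
      // assumes this.lessons is an array of lesson objects with a `passed` flag
      return this.lessons.every(lesson => lesson.passed)
    }
  }
}
```

In a template, something like `<span v-if="isTutorialPassed">🏆</span>` would then show the trophy only once every lesson is passed.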

@terichadbourne
Member Author

With some help from the Countly team we figured out how to filter by one segment and then report on another, and I've mastered a few more tricks to create a ProtoSchool events dashboard that can do things like:

Compare completion rates for each tutorial:
[screenshot]

Compare completion rates of lessons within one tutorial:
[screenshot]

...and some additional "drills" and "formulas" that can't be displayed properly on the dashboard, like:

Distribution of wrong answer selections for each lesson in a tutorial (to help us discover common misconceptions):
[screenshot]

Percentage of correct attempts across all ProtoSchool lesson submissions/selections/resets (we can't yet break this down to view it for a single lesson or tutorial):
[screenshot]

@terichadbourne
Member Author

terichadbourne commented Aug 16, 2019

We're currently blocked on the Countly team; Teri raised this issue with them today, asking for their help in getting it fixed:

> If we get the latest Countly code from https://cdnjs.cloudflare.com/ajax/libs/countly-sdk-web/19.8.0/countly.min.js, which references version 19.8.0, getViewUrl now works for us and affects the view segment reported with linkClick. However, we would like to use https://countly.proto.school/sdk/web/countly.min.js as the way to access your code, to avoid CORS / cross-site issues, and if you look at https://countly.proto.school/sdk/web/countly.js (which we assume is the source for the minified file), it still lists version 18.08.2 (`var SDK_VERSION = "18.08.2";`).

The live site is currently using https://countly.proto.school/sdk/web/countly.min.js, but we believe our event tracking won't fully work until their team updates that code.

@terichadbourne
Member Author

Countly has updated the server version for us. @fsdiogo on Monday could you please update the source in this branch to https://countly.proto.school/sdk/web/countly.min.js and test that the view tracking works correctly for the linkClick event and that this removes the Cypress error that blocks deployment?

@fsdiogo
Collaborator

fsdiogo commented Aug 19, 2019

@terichadbourne I've updated the code to fetch the Countly SDK from countly.proto.school and the event tracking works. The tests are still failing, but they're also failing in the code branch, where they were previously passing. Not sure what's happening.

@terichadbourne
Member Author

Can you help me understand this fix @fsdiogo? Under what circumstances does window.Cypress exist? Are we making it so we can never test Countly now, if Cypress is always there?

@fsdiogo
Collaborator

fsdiogo commented Aug 19, 2019

window.Cypress only exists while the Cypress tests are running, so this kills two birds with one stone, so to speak: it fixes the CI and skips event tracking during Cypress runs, which is exactly what we want!
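
In other words, the fix gates analytics behind that check; a minimal sketch (exactly which tracking calls we wrap is an assumption):

```js
// window.Cypress is injected by Cypress only during test runs, so this skips
// all event tracking in CI while leaving production tracking untouched.
if (!window.Cypress) {
  Countly.q.push(['track_sessions'])
  Countly.q.push(['track_pageview'])
  Countly.q.push(['track_links'])
}
```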

fsdiogo changed the title Track user events in Countly → feat: track user events in Countly on Aug 20, 2019
fsdiogo marked this pull request as ready for review on August 20, 2019 09:17
@fsdiogo
Collaborator

fsdiogo commented Aug 20, 2019

@terichadbourne, can you do a last test run to check if this is ready to go?

@terichadbourne
Member Author

@fsdiogo The segment "tutorial" was actually broken for all events because we merged the PR that changed the word "workshop" to "tutorial" but hadn't made the substitution in the new code. I've fixed that and also updated another counting function to use the padStart method.
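
For reference, padStart zero-pads a string to a given length, which is handy for lesson numbers (values illustrative):

```js
// Zero-pad a lesson number to two digits
String(3).padStart(2, '0') // => '03'
```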

Looks to me like we just need to remove the test lessons now, which I can work on.

@fsdiogo
Collaborator

fsdiogo commented Aug 20, 2019

Oops, forgot about that one! Cool, take care of that so we can merge this one 💪

@terichadbourne
Member Author

@fsdiogo Will you please do one last check to confirm I've removed all the test stuff and things are working on your end, then merge when Travis passes? Thanks!

Collaborator

fsdiogo left a comment


🚀🌔

fsdiogo merged commit a6e1770 into code on Aug 20, 2019
fsdiogo deleted the feat/countly-event-tracking branch on August 20, 2019 14:37