Open S-awesome


In December, an event called 24 Pull Requests promotes giving back to various FLOSS projects in an advent-calendar-esque way.

I managed to make all 24 contributions, which can be found below:

A total of 16 (66%) of the contributions I made were to projects I'd previously contributed to. In effect, only a third of my contributions over the month were to projects that were new to me.
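That split is simple to sanity-check with a quick bit of arithmetic, using the numbers from this post:

```python
# Breakdown of the month's 24 pull requests, per the figures above
total = 24
existing = 16           # PRs to projects I'd contributed to before
new = total - existing  # PRs to projects that were new to me

print("existing: %d/%d" % (existing, total))
print("new:      %d/%d (%.0f%%)" % (new, total, 100 * new / total))
```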

I found that, looking ahead during the month, I would collate a list of "easy" pull requests while browsing reddit or Hacker News threads, saving them to a note. These were minor changes, such as adding links or pictures, that I could pick up and get sorted quickly. However, although these minor changes can be seen to be useful, they didn't really add much to the end user's experience, which is ideally what contributions should focus on.

Because of this, I found that I didn't really have many large contributions to be proud of, unlike with something like Hacktoberfest. That said, there are a couple of contributions I'm pleased with, such as the use of htmlproofer and programmatically listing contributors.

One thing you'll notice is that I made a number of commits to the HacksocNotts repos: 11 in total, in fact. This number is a lot higher than I would have hoped, but a number of these contributions were ones we had previously discussed as "nice-to-haves", and as I had the chance to work on the changes, it made sense for me to make them. However, the issue with this is that Hacksoc is all about the members learning, and I shouldn't be taking that away from them. Instead of making these contributions myself, I should have raised them as issues and given the community the chance to learn for themselves how to go about them. Although I was personally able to learn a lot, it isn't up to me to be learning through the Hacksoc sites!

Overall, I am happy that I went through the experience, to see whether I could do it and what sort of commitment it would require. However, it's shown me that allocating time to side projects (on a deadline, no less) during a very busy month is not the best way of breeding strong contributions. In future, I know not to push myself towards a deadline with too high a target, and instead to aim for much more attainable goals.

PS: I would say that the script that let me determine which contributions I made over the month was actually one of the more useful outcomes. It can be found below and will hopefully shed some insight into how the above Markdown list was generated automagically.

#!/usr/bin/env python3

import urllib.request
import json
import os.path
import re

JSON_PATH = "prs.json"

def flatten(xss):
    return [x for xs in xss for x in xs]

def get_page_of_events(username, page=1):
    # NOTE: the URL literal was lost from the original listing; this assumes
    # the GitHub public events API, which matches the fields used below
    format_url = 'https://api.github.com/users/%s/events/public?page=%d' % (
            username, page)
    response = urllib.request.urlopen(format_url)
    encoding = response.info().get_content_charset('utf8')
    return json.loads(response.read().decode(encoding))

def get_x_pages_of(numPages, username):
    pages = []
    for pageNum in range(numPages):
        page = get_page_of_events(username, 1+pageNum)
        newPage = []
        for p in page:
            # is a PR
            if "PullRequestEvent" != p['type']:
                continue
            # opened PR
            if "opened" != p['payload']['action']:
                continue
            # TODO in December '16
            newPage.append(p)
        pages.append(newPage)
    return flatten(pages)

def is_24pr_event(e):
    if "PullRequestEvent" != e['type']:
        return False
    if "opened" != e['payload']['action']:
        return False
    if not e['payload']['pull_request']['created_at'].startswith("2016-12-"):
        return False
    return True

def markdownify_repos(repos):
    # strip the GitHub prefix from repo URLs to get "owner/name"
    regex = re.compile(r"https?://(www\.)?github\.com/", re.IGNORECASE)
    ret = ""
    for repo in repos:
        ret += "- [%s](%s)\n" % (
                regex.sub("", repo['repo']), repo['repo'])
        for e in repo['prs']:
            pr = e['payload']['pull_request']
            ret += "    - [%s](%s)\n" % (pr['title'], pr['html_url'])
    return ret

def get_by_key(xs, keyName, key):
    for x in xs:
        if x[keyName] == key:
            return x
    return None

def sort_by_repo(events):
    regex = re.compile(r"/pull/.*$", re.IGNORECASE)
    ret = []
    # get the ret[?]['repo'] entry if it already exists;
    # otherwise, create it
    for e in events:
        repo = regex.sub("", e['payload']['pull_request']['html_url'])
        v = get_by_key(ret, 'repo', repo)
        if v:
            v['prs'].append(e)
        else:
            v = {}
            v['repo'] = repo
            v['prs'] = [e]
            ret.append(v)
    return ret

if __name__ == "__main__":
    # cache the events locally so repeated runs don't hit the API
    if not os.path.isfile(JSON_PATH):
        with open(JSON_PATH, 'w') as f:
            data = get_x_pages_of(10, "jamietanna")
            f.write(json.dumps(data))

    jsonData = {}
    with open(JSON_PATH, 'r') as f:
        jsonData = json.loads(f.read())

    events = []
    for e in jsonData:
        if not is_24pr_event(e):
            continue
        events.append(e)

    repos = sort_by_repo(events)
    # emit the Markdown list used in the post above
    print(markdownify_repos(repos))
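To illustrate what the grouping-and-rendering stage produces, here's a self-contained sketch of that step using a hypothetical pair of opened-PR events (the repository and PR details are made up, but the fields mirror what the GitHub events API returns):

```python
# Hypothetical events, shaped like the GitHub events API payloads
events = [
    {"payload": {"pull_request": {
        "title": "Fix broken links",
        "html_url": "https://github.com/example/site/pull/1"}}},
    {"payload": {"pull_request": {
        "title": "Add contributors list",
        "html_url": "https://github.com/example/site/pull/2"}}},
]

def group_by_repo(events):
    # strip the "/pull/<n>" suffix to recover the repository URL,
    # then bucket each event under its repository
    repos = {}
    for e in events:
        url = e["payload"]["pull_request"]["html_url"]
        repo = url.rsplit("/pull/", 1)[0]
        repos.setdefault(repo, []).append(e)
    return repos

def markdownify(repos):
    # emit a nested Markdown list: one bullet per repository,
    # one sub-bullet per pull request
    lines = []
    for repo, prs in repos.items():
        name = repo.replace("https://github.com/", "")
        lines.append("- [%s](%s)" % (name, repo))
        for e in prs:
            pr = e["payload"]["pull_request"]
            lines.append("    - [%s](%s)" % (pr["title"], pr["html_url"]))
    return "\n".join(lines)

print(markdownify(group_by_repo(events)))
```

Running this prints a two-level Markdown list: one bullet for example/site, with a sub-bullet linking to each of the two pull requests.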

Written by Jamie Tanna.

Content for this article is shared under the terms of the Creative Commons Attribution Non Commercial Share Alike 4.0 International, and code is shared under the Apache License 2.0.

#retrospective #open-source #free-software #community

This post was filed under articles.
