Going Carbon Neutral

Here at CharityCAN, we help Canadian charities make positive change in the world by giving them the tools to fundraise more effectively.

But we don’t just want to do our part to improve where we live – we also want to make sure we’re not doing anything to make it worse.

That’s why this summer, we started to offset the company’s carbon emissions in an effort to become carbon neutral.

To achieve this goal, we’re working with a company called Patch to calculate the carbon that running our company and software produces and then purchase carbon offsets to reduce that carbon footprint.

As a small software company, we don’t produce physical goods, so there’s nothing to offset there. But we do run our software on cloud servers, which of course run on electricity. Most of our cloud infrastructure is hosted on Microsoft Azure, which has been carbon neutral since 2012. But we do have some servers running on other services that are not carbon neutral, so we’re offsetting those.

We’re also a remote-first company, so we don’t have a big office building to power. But we do have employees using electricity at their own homes, and we purchase and throw out computer equipment and other office supplies. To calculate how much carbon each employee uses, we borrowed from Basecamp’s estimates as they’re also a remote-first SaaS company.

For the time being we’re using Patch to offset our carbon by funding forestry projects. When our company gets a little bigger we hope to be more ambitious and start funding some carbon sequestration projects as well.

This is just a start – hopefully one day we can join some of the companies who have committed to becoming carbon negative. Until then, we can at least do our part to make sure we’re not putting more carbon into the atmosphere, and encourage other small businesses like ours to do the same.

Our Pandemic Recovery Summer

This summer at CharityCAN, we’re trying something a little different. We’re taking a break.

Well, that’s not quite true – we’re taking a lot of little breaks. 8 of them, to be exact.

This summer, we’re turning every weekend into a long weekend for our employees. Every Monday that isn’t already a holiday between Canada Day and Labour Day becomes a paid day off. No time to make up during the week, just a long weekend for every weekend of the summer.

I’ve been toying with the idea of a summer of long weekends ever since I read It Doesn’t Have To Be Crazy At Work by the folks at Basecamp. You can read up a little more on their four-day summer weeks here.

But to be honest, I was a little frightened as the owner and operator of a small business. Would we really have time to finish all the things we need to get done? Would we fall behind somehow?

This summer though, the math has changed. We’re all tired and worn out after living through more than a year of a global pandemic. As we slowly emerge from lockdowns into summer weather and safer outdoor conditions, we need to spend more time seeing and reconnecting with the people we’ve been separated from for so long.

I can’t sit here and pretend that this is some brave decision or that I’m a trailblazer, either. It’s much easier to jump on the bandwagon when even larger tech companies like Hubspot and LinkedIn are giving their employees “burnout breaks” in the form of week-long company shutdowns, or when Microsoft Japan moves to a four-day work week and sees a 40% boost in productivity.

It’s still not going to be perfect – our co-op students have to work 35 hours a week to get a co-op credit, so they’ll have to work 45 extra minutes on each of the four working days (or flex those hours however they wish), but it’s as close as we can get for now.
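The co-op scheduling arithmetic is easy to sanity-check. Here’s a minimal Python sketch; the 35-hour credit requirement and four working days come from above, while the 8-hour standard workday is my assumption:

```python
# Co-op students need 35 hours per week for a co-op credit, spread
# over 4 working days once every Monday becomes a paid day off.
co_op_weekly_hours = 35
working_days = 4
standard_day_hours = 8  # assumed length of a regular workday

hours_per_day = co_op_weekly_hours / working_days          # 8.75 hours
extra_minutes = (hours_per_day - standard_day_hours) * 60  # 45.0

print(f"{extra_minutes:.0f} extra minutes per working day")
# prints: 45 extra minutes per working day
```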

I’m hoping that if the experiment goes well this summer it’s something we can look forward to here every summer from now on.

We here at CharityCAN hope you have a great summer, however you spend it.

And if you happen to email any of us on a summer Monday starting this week, here’s what you’ll get back:

Thanks for your email! This summer is a “Pandemic Recovery Summer” here at CharityCAN, which means that we’re taking every Monday off from Canada Day to Labour Day.

I’ll respond to your email when I’m back at my desk tomorrow.

Anonymous Co-op Interview Selection at CharityCAN

Anonymous Hiring

Back in the summer of 2020, with Black Lives Matter marches happening in every major city in North America, I was challenged to take a look at the diversity, equity and inclusion (DEI) practices at our little company. As anyone who works at a small business can attest, there often aren’t official practices or procedures in place until something goes wrong and forces you to create something to adhere to.

As a company that straddles the fundraising and software industries, we’re in a double whammy of fields that are predominantly white (in the case of the former) and predominantly white and male (in the latter). Our current staff is split 50/50 between genders, but we only have one team member who is racialized (or 12% of the company).

With that in mind, I contacted Lunaria, a local company that helps companies with their DEI practices. While taking me through some things to consider, Lunaria suggested hiring practices as one place a small company could look to reduce unconscious bias and make sure we’re finding the best candidates regardless of race or gender. While we don’t have any open job positions on the immediate horizon, we do hire a co-op student every four months to help on the software development team. I wasn’t sure how we would do it, but this term I made it a goal to use anonymous hiring, or blind recruitment – stripping away any identifiable information from job applications to reduce bias – while selecting co-op students to interview.


The Baseline

To start out, I was curious to see if I could pull together demographic information about co-op students at the University of Waterloo (UW), where we hire our co-op students. This would let me see whether we were attracting and selecting students roughly in line with the race and gender makeup of the overall student body.

Using the Common University Data Set from UW, I was able to get a breakdown of students along gender lines from the programs we hire our co-ops from, Engineering and Computer Science (CS):

Gender     Engineering    Computer Science    Both Programs
Male       5,428 (71%)    2,296 (75%)         7,724 (72%)
Female     2,220 (29%)      767 (25%)         2,987 (28%)

Race-based data was much more difficult to come by. It turns out that most Canadian universities do not collect race-based data, despite calls arguing that such data is necessary to address inequality and support students.

The closest thing I could track down was a demographic survey on the /r/uWaterloo subreddit. Despite all of the obvious issues with a self-reported survey from a small internet community, let’s take a look at the reported racial breakdown in the Math and Engineering faculties (note that this is slightly different from the Engineering/CS breakdown above – CS is only one part of the Math faculty at UW – but it still gives us some idea). The survey creators also posted a breakdown of the subreddit’s reported race vs. Ontario demographics in general, which helps suggest who may be under- or over-represented in the results.

Hat tip to reddit user fugbox (original image source)!

The Anonymous Selection Process

Now let’s take a look at our hiring process! Every four month term, we submit a job posting to the University of Waterloo’s co-op job portal, Waterloo Works. Students who are interested in the position post a PDF copy of their resume, and the university includes a student grade report and past co-op employer evaluations. I review these application packages and select students for the next stage, the in-person interview.

Following another of Lunaria’s suggestions, I asked our current UW co-op to help with the anonymization experiment, and they took the time to black out identifying student information in the applications (names, addresses, emails, etc.), leaving only their student numbers behind for reference.

Next came the selection process. I was surprised at how much my old (non-anonymous) process relied on names until they were gone! My brain was apparently trained to use a student’s name as a placeholder in my head, and with student numbers any sort of personality I might have built up completely disappeared. It made remembering which resumes I had already read a little difficult, but it’s easy to see how unconscious bias seeps in without you even thinking about it.

Something else I noticed was that it’s probably not enough just to scrub names and email addresses – next time I’ll probably scrub any “interests” from the resumes as well. They made it too easy to make a gender (e.g. “mixed martial arts” vs. “figure skating”) or race (“Chinese Student Association” vs. “Minor League Hockey Referee”) assumption.

Other than that, it was no trouble to whittle down the resumes based on the anonymized data.


Results

Here are the results of the anonymous selection process broken down by gender and race (these are just guesses – the only way for me to identify student race and gender was to use names and LinkedIn profiles from applications). Out of 48 initial candidates, I selected 7 students for one-on-one interviews. Here’s how that looks:

Assumed Gender    # Students Applied    # Students Selected
Male              40 (83%)              5 (71%)
Female             8 (17%)              2 (29%)

Assumed Race or Ethnic Background    # Students Applied    # Students Selected
Asian                                24 (50%)              1 (14%)
Black                                 1 (2%)               1 (14%)
Latinx                                1 (2%)               0 (0%)
Middle Eastern                        3 (6%)               1 (14%)
South Asian                          14 (29%)              2 (29%)
White                                 5 (11%)              2 (29%)
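If you want to reproduce the applied-vs-selected comparison yourself, here’s a minimal Python sketch using the counts from the tables above (each percentage is just a group’s share of the 48 applicants or 7 selections):

```python
# Applicant and selection counts by assumed race, from the tables above.
applied = {"Asian": 24, "Black": 1, "Latinx": 1,
           "Middle Eastern": 3, "South Asian": 14, "White": 5}
selected = {"Asian": 1, "Black": 1, "Latinx": 0,
            "Middle Eastern": 1, "South Asian": 2, "White": 2}

total_applied = sum(applied.values())    # 48
total_selected = sum(selected.values())  # 7

# Compare each group's share of applicants to its share of selections.
for group in applied:
    applied_share = applied[group] / total_applied
    selected_share = selected[group] / total_selected
    print(f"{group:18s} applied {applied_share:4.0%}  selected {selected_share:4.0%}")
```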

So in the end, the anonymous process gave us a fairly diverse group of students. Along gender lines, it selected candidates in proportions that roughly matched the applicant pool.

But after all that, I still ended up with an over-representation of white students! This is a small sample size, so maybe it doesn’t mean anything, but I wonder if some unconscious bias was still at work – through the interests sections of the resumes, or their formatting. Maybe the fact that these students are white means they have had better co-op jobs, or received better evaluations on those jobs, in the past. I also ended up with a severe under-representation of Asian students. Again, I’ll have to see whether this is some sort of bias or just a small-sample issue.

When I compare our applicant pool to the student body breakdown, it seems like our applicants are more or less as racially diverse as the general student population, but it doesn’t look like we’re attracting quite as many female candidates as are in the general student population. We did end up with a pretty good representation in our selected candidates, however. Maybe we can update our future job descriptions to make them more inclusive.

In the end, the anonymous selection added negligible overhead and seems to have worked out so far! I’m looking forward to using it again next term and finding other ways to improve our process.