Blog

January 07, 2022 22:53 +0000  |  Free Software Linux Majel 1

For the last two years I've been working on Majel, a project that allows you to control your computer with your voice. The first incarnation was released back in March, but was dependent on Mycroft, so I've been working to rewrite it to be independent. The end goal: to be able to release it as an image file for the Raspberry Pi so people can just download it, burn it onto an SD card, pop it into their Pi and have it Just Work™.

The development process has been surprisingly easy, with only a few hiccups around audio processing in Python. The new architecture has proven to be really solid and I'm excited to share that in detail at a later date -- but that's not what this post is about. This post is about packaging for the Raspberry Pi and what a nightmare it's been for me.

For the purposes of this post, you just need to know that I wrote a Python application that interfaces with GNOME, Firefox, Chromium, Skype, and other desktop tools to do cool stuff.

What follows is a series of things I now understand that came at the cost of a lot of time and hair-pulling. If you're thinking about going down this road for your own project, my hope is that sharing my experiences here will help save you frustration in the future.

The CPU Architecture

Anyone who knows anything about the Raspberry Pi project can tell you that these little devices don't run the same kind of CPU you're probably used to. Where most computers we use today (not including phones) use x86 processors (typically built by Intel or AMD), the Raspberry Pi uses ARM chips. If your knowledge of the situation (like mine) ended there, then I'm about to save you some pain.

Just like the x86 ecosystem (which consists of i386, i686, x86_64 and other "sub-architectures"), the ARM family includes a wide variety of architectures which you need to build explicitly for when you're making your own stuff. For example, if you've got a Python wheel labelled aarch64 it will only run on 64-bit ARM systems, while one labelled armv7l will run on 32-bit ARM systems.
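
If you're ever unsure which of these your OS actually reports, Python can tell you directly. This is just a quick sanity check, nothing Majel-specific:

  import platform

  # Prints "armv7l" on (32-bit) Raspberry Pi OS and "aarch64" on a
  # 64-bit OS like Manjaro ARM.
  print(platform.machine())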

The Raspberry Pi 4's hardware can run both, but the default "Raspberry Pi OS" is 32-bit and exclusively runs armv7l binaries. If you want to use aarch64, you must install an OS other than the default.

Python Support

In the Python world, the vast majority of packages on PyPI are "pure Python" (i.e. they'll run on any system that's already running Python). However, there are a lot of packages out there that come bundled with some compiled code (usually C or C++). These packages must be compiled specifically for your architecture in order to run, and if your architecture isn't covered by a pre-existing build, you either have to build it yourself (painful, especially on a Pi) or you're shit out of luck.

For example, the popular cryptography library is not pure Python and therefore must be compiled for the architecture it's running on. Thankfully, that project's maintainers support a variety of platforms, but note that armv7l isn't one of them.
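
Relatedly, you can ask the packaging library which wheel tags your machine will accept (you may need to pip install packaging first); any wheel you hope to install without compiling has to match one of them. A quick diagnostic sketch:

  from packaging.tags import sys_tags

  # Each tag looks something like "cp39-cp39-manylinux2014_armv7l" on a
  # 32-bit Pi; a pre-built wheel has to match one of these, otherwise
  # pip falls back to building from source.
  for tag in sys_tags():
      print(tag)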

In fact, finding a package on PyPI with armv7l support is quite rare. Instead, Raspberry Pi users have a special "hack" on their system (one of many I discovered in my travels): an additional Python package index, enabled by default, called piwheels.org.

If you're running Raspberry Pi OS, you'll find that nearly all of your not-pure-python packages are not coming from PyPI at all, but are rather coming from piwheels.org: a repository of .whl files, built exclusively for the Raspberry Pi. This is pretty great, though it was definitely a surprise.
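
The mechanism behind this is plain pip configuration: Raspberry Pi OS ships an /etc/pip.conf that adds piwheels as an extra package index, something along these lines:

  [global]
  extra-index-url=https://www.piwheels.org/simple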

If however you're not using Raspberry Pi OS and are instead using an aarch64-based OS like Manjaro, then there's no piwheels.org for you. Instead, you have to hope that the package you need has pre-built support for your architecture. Thankfully, aarch64 is much more common in PyPI, but it's not everywhere. The vosk package for example has armv7l packages but not aarch64 ones.

Finally, Poetry has an annoying bug/limitation that means you can't configure your pyproject.toml file to work across architectures. Your poetry.lock file will only store hashes for one architecture at a time, so if you run poetry update on an x86_64 machine, the resulting poetry.lock will be entirely different from one generated on an aarch64 machine. As this undermines the whole idea of a consistent, distributable, versioned lock file, it's rather disappointing.
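
You can at least express per-architecture dependencies with standard environment markers, as in this hypothetical pyproject.toml excerpt (the version pin is made up), but that doesn't fix the lock file problem: poetry.lock still only records what was resolvable on whichever machine ran the lock.

  [tool.poetry.dependencies]
  python = "^3.9"
  # Hypothetical: only pull in vosk where an armv7l wheel exists.
  vosk = { version = "^0.3", markers = "platform_machine == 'armv7l'" }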

The Operating Systems

So now that we know a bit about the limitations of Python in different operating systems running on the same hardware, let's talk about those systems in more detail.

Raspberry Pi OS (formerly Raspbian)

Raspberry Pi OS is Debian-based, but critically it is not your typical Debian system. In an effort to make using the Pi easy for everyone from children to seasoned professionals, the Raspberry Pi Foundation has applied a lot of tweaks and hacks to standard Debian which can catch you off guard if you're not ready for them.

Old & Busted Software

Like any Debian system, everything is old-as-fuck as the maintainers prefer stability over modern features. If you're using your Pi to control humidity in a greenhouse, this is probably a good idea, but if you're hoping to take advantage of modern graphical user interfaces, you're going to have a bad time.

For example, the current version of GNOME available for the Pi is two versions behind the GNOME project's release schedule, and it's a good bet that the gap will grow with time. As for Firefox, the most recent version you can get is Mozilla's "extended support release" (ESR), which is a nice way of saying "we promise to support this version for years and years, but it won't be meaningfully updated during that time".

What's more, simply installing GNOME on a standard Raspberry Pi OS image absolutely will not work because there's something called pi-package installed by default that claims to have installed an inferior version of gnome-settings and that conflicts with the would-be-installed version. You must instead use a "Lite" version of the image (the one that doesn't come with X or LXDE) and then install GNOME from there.

Special Configuration Pattern

Configuration of the Pi is done with a program called raspi-config which is installed by default, but if you're using a Pi 4, most of the options you can select in this tool will fail to apply.

As best I can tell, Bluetooth is entirely broken from the start. None of the usual patterns I would expect to get it working (like running systemctl start bluetooth and opening the Bluetooth UI) resulted in success. This is not a hardware problem, but a software one. I can only assume that there's some special Raspbian way to do this.

Non-standard Re-packaging

Chromium is a first-class citizen in Piworld, installed by default on the standard image, but strangely listed as chromium-browser rather than the usual chromium. You can even get Widevine support in it (so you can watch encrypted video on Netflix & Prime) simply by running apt install libwidevinecdm0. This deviates from what you see on a typical Chromium install, since modern versions of Chromium allow you to download Widevine support automatically. I can only assume that this is a special concession for the armv7l architecture.

Widevine support in Firefox appears to be impossible.

Kodi has been compiled to exclusively run without an X server or Wayland present. Undoubtedly this is to allow Pi users to just install Kodi and run it without the overhead of a UI they aren't using, but if you want that standard overhead, you're SOL.

Building Your Own Image

If your goal, like mine, is to distribute your app as a Raspberry Pi image, then you'll want to look into pi-gen, an automated system that lets you build a Pi image on an x86-based machine. It's impressively simple, but critically it only runs on Debian-based systems. If you're running Fedora, Arch, or some other system, they have a Docker-based runner, but I couldn't get it to work. To get it working on my Arch system, I started a Debian VM and it worked beautifully... after consuming a whopping 42GB of disk space!
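
For reference, pi-gen is driven by a file called config in the root of its repo. IMG_NAME is the only setting I'd call essential here (the project's README documents the rest), so treat this as a minimal sketch:

  # Sourced by pi-gen's build.sh / build-docker.sh.
  IMG_NAME='majel'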

My original idea was to build Majel images automatically via GitLab's CI. With a disk footprint like that, however, I'm afraid I'm going to have to rethink that idea.

Manjaro

After running afoul of all of the above, I started looking into alternative base images to work with. Thankfully, Raspberry Pi's excellent Pi Imager (available on FlatHub) makes the burning of alternative images super-easy, and I found Manjaro Linux (a flavour of Arch Linux) to be a really good starting point for my project. In fact, there's a GNOME variant available so you can burn an image that boots into GNOME shell!

As an Arch-derivative, it runs really close to the bleeding edge, so installing a modern version of Firefox and Kodi was super-easy. There were a few surprises though.

It's Not the Same Architecture

While Raspberry Pi OS is running on armv7l, Manjaro builds all of its packages for aarch64. That means that piwheels.org is out of the question, and that there's still going to be some Python packages that aren't published to PyPI with support for Manjaro on a Pi (looking at you vosk).

Wayland is the Default

It's a "new hotness" sort of OS, which means that the default UI server isn't Xorg, but Wayland. For most people, this is probably ok, but for me, since my project relies heavily on Xorg (Majel uses pyautogui which can't do Wayland) this was a problem. Thankfully, you can switch to using Xorg simply by installing xorg-server and uncommenting the WaylandEnable=false line in /etc/gdm/custom.conf.

Widevine is... Problematic

While getting Widevine support in Raspberry Pi OS is easy, getting it working in Manjaro is pretty sketchy. Sure you can install modern versions of both Chromium and Firefox and they work great, but Widevine isn't there, and it won't autodownload, even in Chromium.

Instead, you have to install this crazy/amazing package called chromium-docker from the AUR. The installation process builds a local Docker image of Ubuntu wherein you install Chromium and you can take advantage of the aforementioned libwidevinecdm0. Running it from that point forward involves starting the Docker container and running Chromium from inside it. That's just... bananas.

Packaging is Tricky

The easiest way to make my project installable on Arch-based systems is to contribute an AUR package, but writing one that will install properly on both aarch64 and x86_64 systems was surprisingly not straightforward.

All the docs you read will tell you that there's one variable you set for package sources, conveniently called source=(). What took far too long to find was that you can actually suffix this variable name with the name of the architecture: source_aarch64=() and source_x86_64=(). You then do the same for the sha512sums=() variables and finally, you write some sketchy if/else Bash in your package() function to check if ${CARCH} is equal to aarch64 or x86_64 etc. Have a look at what I had to do for the vosk library if you're curious.
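
Stripped of everything project-specific, the pattern looks roughly like this (the URLs and checksums are placeholders, not anything from my real PKGBUILD):

  arch=('x86_64' 'aarch64')

  # makepkg appends the matching source_${CARCH} array to source=().
  source_x86_64=("https://example.com/blob-x86_64.tar.gz")
  source_aarch64=("https://example.com/blob-aarch64.tar.gz")
  sha512sums_x86_64=('SKIP')
  sha512sums_aarch64=('SKIP')

  package() {
    # ${CARCH} holds the architecture makepkg is building for.
    if [[ "${CARCH}" == "aarch64" ]]; then
      echo "install the aarch64 build here"
    else
      echo "install the x86_64 build here"
    fi
  }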

Creating Your Own Image Looks Easy

Manjaro has all of their OS builds available on GitHub, so from the outside it looks like making your own build should be easy. I haven't tried it yet though, so I can't comment.

Everything Else

With the exception of the above, working with Manjaro on the Raspberry Pi is delightful. Getting my Flic button paired with the Pi via Bluetooth was 100% painless and straightforward, and the OS in general has all sorts of nice creature comforts built into it, like zsh by default, a pretty drop-in replacement for cat, and a nice set of custom icons.

Ubuntu

Finally, there's Ubuntu, which admittedly I actively dislike. The whole proprietary Snap system, the ugly re-skinning of GNOME, the dependence on Debian unstable under the hood so that everything is both old and broken... Ubuntu is everything I don't want in Linux under one roof. It's also hugely popular though, and likely the only place I'll be able to get Widevine easily, out of the box.

The first time I installed it, it locked up the mouse and keyboard for minutes at a time during the initial setup phase. As I write this, I'm still waiting for the initial boot to finish and the mouse is frozen on the screen. I'm not confident that my desire to see this work will be strong enough to overcome my contempt for this distro.

In General

The Pi is marketed as a tiny computer that you can leverage to do anything your heart desires, provided you have the time and patience, and are comfortable with a low-power device doing the lifting. The question, though, is: is something as complicated as a voice-activated desktop automation system that plays streaming video even possible on hardware as limited as a Raspberry Pi?

It turns out, it's totally doable. In Raspberry Pi OS, I managed to bring up simultaneous instances of Firefox and Chromium and play "The Witcher" on Netflix by way of voice command. All processing, even the speech-to-text handling was done on-device and the performance was admirable.

The only caveat I will mention is that streaming video at full screen will absolutely not work at 4K resolution. In fact, I didn't get anything resembling a good framerate until I bumped the resolution all the way down to 1280x720. For my purposes though, this is completely reasonable: this is basically a very smart television after all and the quality of stream I get from Amazon Prime is abysmal anyway.

Conclusion

As long as this post is, it isn't even the end of my development process. I still have to give Ubuntu a fair shake and decide which of the above will be the reference platform for Majel. It'll install just fine on x86-based systems, but as the Pi is what I always envisioned for it, I want to get this part right before I officially "release" the new Mycroft-free version 2.0. Hopefully that'll be sometime in the spring, as I only have a few hours a night to work on it.

Until then, maybe the above will be useful to someone. If it was, please leave a comment! If it wasn't and you have questions, feel free to ask :-)

May 11, 2021 22:51 +0000  |  Free Software Majel 0

I'm going to try to do a better job of recording my development process here, and to that end, I wanna talk about my latest Big Shiny Idea.

My Majel project was mostly well-received, but the most common piece of feedback I got was how hard it was to install. The dependency on Mycroft really kicked the project in the nuts because Mycroft isn't really user-friendly and the company behind it doesn't appear to be prioritising that aspect because they intend to make their money selling physical devices.

On top of that, Mycroft isn't exactly ideal for this sort of thing, since it operates on the same premise as Alexa & Google Home: it's actively listening for commands, which means there's a constant battle between being able to hear said commands and, you know, playing music. I'm just tired of saying "Hey Mycroft... HEY MYCROFT" and getting nothing because it can't tell the difference between my voice and the one on the TV.

So I have a new plan, with a new architecture, and dreams of usability. Fun stuff!

At the moment in my head there are at least 3 components: the "orchestrator", the cloud service, and the remote.

The Orchestrator

This is basically what Majel already is, but I'm going to refactor it to leverage pyautogui and control your whole desktop rather than just your browser.
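
To give a sense of what "control your whole desktop" means in practice, the orchestrator's job boils down to sequences of pyautogui calls like this (a simplified sketch, not actual Majel code; the five-second wait is a placeholder):

  import webbrowser

  import pyautogui

  def play_on_youtube(url: str) -> None:
      """Open a video in the default browser and push it to fullscreen."""
      webbrowser.open(url)
      pyautogui.sleep(5)    # crude wait for the page to load
      pyautogui.press("f")  # YouTube's fullscreen shortcut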

The Remote

No more yelling at the screen. You have an app on your phone consisting of two buttons: listen and stop, along with the possibility of a few presets you use often. These buttons send messages to the orchestrator to tell it what to do.

There's no reason this has to be just a mobile app though. It can be a tap on your watch, or an IOT button if you like.

The Cloud Service

Unfortunately, after a lot of digging, I've found that there's no (easy) way to have a web app or mobile app talk to an unencrypted WebSocket. There are restrictions built into most platforms that block that sort of thing, so you have to encrypt the traffic, and the easiest way for non-nerds to do that is to use a third-party service. This is that service: a dumb relay between remotes and orchestrators that itself cannot impersonate a remote (for obvious security reasons). This cloud service could be self-hosted of course, but for new users, or people who just don't care about that sort of thing, remotes & orchestrators will connect to majel.danielquinn.org.
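
As a rough illustration, a remote's job is just to open a TLS WebSocket to the relay and push a small message at it. Here's a sketch using the websockets library (the hostname, path, and message format are placeholders, not a finished protocol):

  import asyncio
  import json

  import websockets

  async def send_command(command: str) -> None:
      # wss:// (i.e. encrypted) so mobile and web platforms allow it.
      async with websockets.connect("wss://relay.example.org/remote") as ws:
          await ws.send(json.dumps({"type": "listen", "command": command}))

  asyncio.run(send_command("play the west wing"))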

So that's the idea. I'm still in the planning phase, but I've got a lot of energy behind me on this for the moment. We'll have to see where that leads. For now, here's the diagram I worked out tonight:

January 03, 2021 21:16 +0000  |  Economy Employment Free Software Health Politics Software 0

This year sucked. That line is probably enough to remember the nightmare that is 2020 when I'm (hopefully) looking back on this post in 10 years, but as it's my tradition to go into depth on the past year at the start of a new one, let's go a bit deeper into why this year sucked so much.

The Pandemic

This was the year that the COVID-19 pandemic took off. Lockdowns all over the world started around March and for the more civilised countries (New Zealand, Taiwan, a few others) that was the end of it. The rest of the world however could not get our shit together.

From the talks of "natural herd immunity" to the politicising of the virus and its prevention as a left-wing conspiracy, nearly every country failed to do the right thing in the most calamitous way possible.

It's left the people with a sense of reason exhausted. I mean, we have experts in this field. Those experts told us what we needed to do to stem the spread. Our leaders overwhelmingly did not heed that advice and chose instead to let 1.8 million people die (so far).

Even while mass graves were being dug in New York, leaders in nearly every nation were refusing to even close the schools. Here in the UK, (home of the famous "take it on the chin" comment by our fearless leader) we had policies that actually encouraged people to eat out at local pubs, and no mask mandate. Now the UK wears the dubious distinction of being the source of a much more virulent strain of the virus. Other countries have closed their borders to us, but nearly all continue with anti-science policy that inevitably leads to more death.

Vaccine Development

There's some good news though: 3 promising vaccines have made their way through a (very rushed) development & testing process to be cleared for emergency use in Europe and North America (and presumably elsewhere). The roll out has (unsurprisingly) been a mess here in the UK, and now there's talk of actually mixing-and-matching the vaccines which sounds insane to me, but again, unsurprising given the kind of leadership this country has.

From my (admittedly ignorant) read of the science behind this though, I'm currently on-board with getting a vaccine (or a "jab" as they call it here) when it's made available to me. As I understand the risks of so-called "Long COVID" vs. the nature of an mRNA vaccine, it's still a smart move in my mind.

Radicalised

Was 2020 a “bad year” or are we simply approaching the inevitable conclusion of living under an economic system that is fundamentally incompatible with human dignity and happiness?

Throughout all of this, I've become more "radicalised". My contempt for capitalism is more palpable, and I'm angrier every day.

All of this, all of this is a direct result of capitalism. From the Chinese government refusing to crack down on wild/exotic animal wet markets, to the world's pandering to their carelessness, to their covering up of the outbreak until it was too late, to the world's reluctance to close the borders, to anti-science policies in nearly every nation treating the working public like expendable peasants. All of it is driven by capitalism:

China

We've continued to trade with China and support their economy because it's profitable for the rest of us. It doesn't matter that they commit genocide or are among the worst polluters on the planet. We pretend that this is only their problem when logically we know that it isn't. The same is true for their public health regulations.

We knew that China's public health policy was a breeding ground for pandemics. We've seen it before. But isolating them? Punishing them for being a threat to world health? That would affect our profits.

And so we did nothing and China acted exactly as everyone knew they would.

Management once the pandemic started

The science was clear on all of this:

  • Close the borders
  • Close the schools, the churches, the markets, and the malls
  • Limit travel
  • Limit the spread by keeping people at home
  • Track and trace infected cases

But we all had rent and mortgages to pay. Around 300 million people (the Americans) couldn't even get medical care if they were unemployed. How could anyone possibly do the right thing and follow the science?

Our governments could have stepped in. They could have put a moratorium on rent and mortgages. They could have mandated the expansion of grocery store delivery networks and required that no one be permitted to go to work if that work is not directly involved in a key industry like the food supply, public health, utilities, or the military.

The right thing would have been to do this for just a month or two and get a handle on the virus. Limit its spread and understand its behaviour. It could have been financed through a wealth tax or some other fiscal tool levied against those profiting from the pandemic.

We didn't do this though, because capitalism demands that we all go to work doing jobs that don't really matter so that the very rich few continue to accumulate wealth. It's a given that millions will die, but it's also understood we're all replaceable.

Disaster Capitalism

All of this is what Naomi Klein calls "disaster capitalism": the idea that disasters are leveraged (if not also created) by people who profit from them.

There are absolutely winners in all of this: Amazon and Tesco for example both posted record profits while exploiting their workforce. As The Guardian pointed out:

Bezos has accumulated so much added wealth over the last nine months that he could give every Amazon employee $105,000 and still be as rich as he was before the pandemic.

None of this is to say that there's some sort of illuminati cadre of rich assholes running the world. Only that the world is as it is because these sorts of people profit from it the way things are rather than how we all know they should be.

We don't need 2¢ USB sticks from China or next-day delivery of slippers from Amazon. We need a universal basic income, nationalised health care, and a government that understands the economy as a system of land, water, and people rather than currency.

This pandemic has happened entirely because we have prioritised personal wealth over humanity.

It's not just a bad year

Towards the end of the year, it became fashionable to refer to how we'll all be glad that 2020 is over, because somehow everything was going to be better in 2021. Nothing has changed though, and so even if the vaccine is rolled out smoothly and the pandemic subsides, all of this — in one form or another — will happen again because that is what this system was designed to do.

The worst is yet to come. Next up we're looking down the barrel of a crippling depression and the appallingly inevitable climate catastrophe. The skies above California literally turned red this year, and yet that nation still has no salient climate plan. The world community has done little more than talk about how we should probably do something, but fossil fuels are still subsidised by nearly every industrialised nation.

There's a reason you feel like things have only been getting worse: they have. Disaster capitalism is as much about profiting off of disaster as it is about demoralising the peasantry and keeping us fearful. We've been "holding on" for so long, hoping for things to get better when they absolutely will only get worse so long as we live under this system.

In Other World News

Despite the pandemic, there were a lot of things worth noting that happened this year:

Black Lives Matter

George Floyd was murdered by a police officer and the country, the world was (finally) enraged. From what I've been hearing, very little has come of the rage though, as the pandemic has made mobilisations difficult. Still, calls for defunding or abolishing the police are finally being taken seriously, so that's a start.

Trump

Trump made it through all four years and got clobbered in an attempt at re-election. I maintain that if this pandemic hadn't happened, he would have won a second term (I have that little faith in the US), but with more than 350,000 dead so far and millions losing their jobs, there was no way he was going to win in a fair fight.

The question then was how much would the Republicans have to cheat to win this one, and they did their best: everything from gerrymandering, to restricting access to voting places, to sabotaging the postal system. None of it was enough to give Trump a win, though it may well have been enough to hold onto the Senate. We'll know in a few days with the Georgia run-off vote.

Oh, and there's widespread claims that the election was somehow fraudulent, and that Trump was actually the winner. This has led to Trump-devotees holding (maskless, of course) rallies calling for the arrest of Joe Biden.

And one more thing: Q-Anon is a thing now. There's a lot of overlap between these nuts and the nuts claiming that Trump actually won.

My Life, Directly

In comparison to any of the above, my life doesn't exactly feel significant, but this is my blog, so I'm going to cover that too.

Lockdown

The (limited) lockdown we had here in the UK was rough. I was just holding onto my sanity, being able to send my 1-year-old away to the childminder during the work week, but when that was all cancelled, Christina and I became full-time babysitters while also being full-time employees.

We "managed" this by working in shifts. I would work 4 hours while Christina looked after Anna, then I'd take care of Anna for four hours while Christina worked. When Anna napped midday, we'd both work, and when dinner came around, one of us would cook while the other took care of the kid, then she'd go down and both of us would go back to work 'till 11 or midnight at which point we'd go to sleep only to repeat this... for the entire month.

I won't complain though. It was hard, but at least we remained employed through the fortune of having remote-friendly work. I know that a lot of people in this country were looking down the barrel of no income and substantial rent to pay, so I know that we've been very fortunate.

Our childminder was freaking out when she heard the news that she couldn't keep her doors open, since no kids meant that her income was suddenly reduced to £0. Christina and I decided however that so long as our employment situation didn't change, we would continue to pay her as if Anna was in full attendance as usual.

Fear

The worst part of this though — at least for me — has been the looming fear. Yes, the odds of death are low, but they're still very high compared to almost anything you would choose to do on a daily basis. On top of that, the long-term health effects of COVID-19 are almost entirely unknown. There are reports of cramps and migraines lasting months, and of permanent heart damage, so this isn't something anyone wants to get.

My parents are both very high-risk, and yet they continue to have regular visits with my brother who flies all over Canada for work. It doesn't help that my brother's attitude toward COVID is more dismissive than anything else.

Personally I've had breathing concerns for years ever since I contracted pertussis in my late teens. Every time I've had a bad flu since then, there have been moments where the coughing and seizing locks up my whole respiratory system and I literally can't breathe. In those moments, I'm taken back to that year where whooping cough was destroying my lungs and I think that maybe this time will be the last... and then it subsides.

...and that's the flu.

I may talk a big game about the macro-level implications of this thing, but I'm honestly — personally — worried.

Christina is less concerned (which doesn't help with my own fears). She's frustrated by the way this year has likely stunted Anna's social development, by how rarely we see our friends (always outside, at a "safe" social distance), and she remains (rightly) concerned about the way the vaccines have been rushed through, and how public health is once again being politicised: you're either happy to give your 2-year-old a vaccine that's never been tested on 2-year-olds, rolled out by a government with a demonstrated lack of interest in public health, or you're an idiot anti-vaxxer who hates Britain.

There's a lot of stress to go around.

Goodbye Workfinder, Hello MoneyMover (again)

On the corporate front, I said goodbye to Founders4Schools/Workfinder back in November, and while I'll miss a lot of the people there, I won't miss working there for a variety of reasons.

For the last 2 months of 2020, I went back to MoneyMover to help move some of their codebase forward. I'd been helping to keep things running in my off-hours for the last 2 years, but there were a lot of things that needed more dedicated attention, so I agreed to come back for a short stint to help out. It's a great place to work, so I've really enjoyed being able to work with everyone again.

Later this month, I'll be moving on to my next full-time job, this time with LimeJump. That move warrants an entirely separate post though, so I hope to get to that soon.

Majel

Finally, the best news (for me anyway) this year was the "launching" of my latest side project, Majel. I won't be announcing it to the nerd world for a few days still, but I'm really happy with how it's turned out.

Majel is a front-end for Mycroft, an OpenSource Alexa replacement. Imagine being able to "install" Alexa on your laptop or a Raspberry Pi and know that it does what you want without eavesdropping on your conversations. Mycroft even sells dedicated devices that do the same thing (just like an Echo), again, all Freely licensed so you can extend it in any way you like.

Majel is one such extension, my add-on to the Mycroft system that allows you to control a web browser with voice commands. Sure, maybe Alexa can control a "smart" TV and play shows from Amazon Prime, but it's unlikely that Amazon will also let Alexa control Netflix, let alone a local library stored in something like Kodi.

So I wrote Majel to do just that. You can say stuff like:

Play The West Wing

and it'll look at your local library and play those files if you have them (remembering where you left off of course). If you don't have them, it'll ask Netflix & Amazon who has the show and then play it with the service that does.

It also does stuff like:

Youtube baby shark

Where it'll look up "baby shark" on Youtube and play the first search result, full-screen and on a loop. Anna was thrilled.

Finally, it plugs into my Firefox bookmarks to do handy things like:

Search my bookmarks for chicken

Where it'll draw up a touch-friendly web page full of chicken recipes from my curated collection.

It's all licensed under the AGPL and regardless of whether or not there's much interest in it, I'll likely continue to develop on it. I want to be able to tell it to do basic web stuff, like do a Google/DuckDuckGo search for something or pull up a Wikipedia page on an arbitrary topic. I also want to get it to a point where I can say:

Call the parents

and have it start a video call, but that'll likely require working with something like PyGUI, so it may be a while before I can figure that out.

Anyway, I'm really happy with it, and it represents the culmination of roughly a year's work, squeezed into my off hours after Anna's gone to bed and when I'm not already expected to do some off-hours contracting. I'm hoping it'll show the Mycroft project a way toward making these digital assistants a more visual experience, but even if it flops, I'm still happy to have it running on my old Surface Pro 3 in the kitchen.

April 18, 2019 11:04 +0000  |  Free Software 0

A while back I made a small contribution to GitLab and they were so appreciative that they sent me a free mug which I then tweeted about.

This tweet was rather popular, as it was re-tweeted by a bunch of GitLab contributors and staff, and among a few thank-yous, I received one private message from someone asking about how easy it was to contribute and if I had any tips about the process.

As I've done this a few times (mostly as one-offs) and have a few Free Software projects of my own out there, it turns out I did have some pointers. I thought it was worth sharing them here.

  1. Respect the requests of the project. If they have a coding style, follow it as carefully as you can. They may come back with requests for changes to conform to the style guide. Just roll with it and adjust your code. For large projects especially it's important that all contributed code conform so that the total project doesn't end up looking like a Frankenstein of different styles.
  2. Don't go big (at first anyway). Make your first merge request a small one that fixes a simple thing and/or adds a simple feature. If your changes introduce new functionality, make sure that your merge request includes a test or two to support it. If you don't include a test, it's very common for the maintainer(s) to request one as tests (a) help others understand what your changes are supposed to do, and (b) ensures that other people's changes don't accidentally break your stuff down the road.
  3. Be accommodating. This overlaps a bit with 1 and 2, but basically the thing to remember is that while you think of your addition as a gift (and it is), it's also a burden to the maintainers. While your code may fix a bug or add something awesome, if it's hard to maintain/understand, doesn't come with tests, doesn't conform to the established style, or some-other-thing-that's-important-to-the-maintainer(s), then you're introducing pain rather than offering something valuable. In a well-maintained project (like GitLab) the maintainers will be friendly and responsive and will work with you to get your merge request into a shape that's compatible with the long-term goals of the project. Work with them to do what's needed. Making your first merge request is just the first step toward actually getting your changes merged.

As for the technical part, this is a pretty good process for any merge request to any project (though admittedly I didn't follow this for this one merge request to GitLab as it was just a documentation fix):

  1. Check out the code locally and get it running. Depending on the size of the project, you may not be able to get all of it up, but at least get the part you want to change/test.
  2. Run the tests and make sure they pass.
  3. Create a separate branch off of master (or whatever branch the project asks you to branch from).
  4. Make your changes.
  5. Add some tests to confirm your changes (you may want to do this one before #4 if you prefer writing tests first).
  6. Run all the tests together to make sure they still pass.
  7. Commit everything. Some projects ask that you break everything up into logical commits, while others ask you to collapse everything into a single commit. Check the rules for each project to see if they have such a policy. If not, use your best judgement.
  8. Make your merge request!
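
In terms of actual commands, the above usually maps to something like this (the remote URL and branch name are placeholders; run the project's own test suite where indicated):

  git clone git@gitlab.com:example/project.git
  cd project
  git checkout -b fix-the-thing      # step 3: branch off master
  # ...make your changes, add tests, run the test suite...
  git add -A
  git commit -m "Fix the thing"
  git push -u origin fix-the-thing   # then open the merge request in the UI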

April 01, 2014 23:24 +0000  |  Free Software Politics 0

I had a bit of a revelation the other day. I realised that we live in an age where state power and secrecy are so fragile that governments are completely at a loss regarding what to do about it. All it takes is for one person with a conscience to copy a file and push it out onto the Internet and that's it: everyone can know the truth, and no one can stop it from circulating.

We've seen this at play with Wikileaks, and more recently with Edward Snowden, and the public worldwide overwhelmingly supports the work of these individuals and organisations in bringing to light truths that need to be told. This amazing future we're living in, one that allows us to practise journalism of conscience, exists solely because of a concept we often refer to as the "Open Web", and it's under attack.

Mozilla screencap

Without getting too technical, there are concerted efforts by governments and corporations to break down the Open Web into more manageable chunks. Governments want a means to control what data gets distributed, copyright holders want a way to monitor what you share and with whom, ISPs want to turn the internet into a series of channels so that they can charge for so-called premium services, and then there's companies like Netflix that actively work to privatise the web itself, pushing for proprietary, closed standards.

Our Free and Open Internet is being attacked on all sides and no one has been a greater ally for the public good in this fight than Mozilla. Yes of course, The Pirate Bay has been a champion of our rights for years, and Google has our back... sometimes, but when it comes to a reputable, reasonable, non-profit voice in the one area that counts the most for this point in our evolution, Mozilla is it.

Mozilla's decision to promote Brendan Eich to CEO was not an endorsement of his politics, but it's still terrible for many reasons, not the least of which was the foreseeable backlash they had to have known they would face from the community who have come to love, trust, and promote Mozilla over the years. His presence in the Big Chair undermines all of that goodwill, and his unwillingness to seek some sort of reconciliation with the community only serves to damage relations further.

But this business of boycotting Mozilla, driving people away from a Free and Open Internet and into the hands of private and government interests, this is not good for anyone but Google, Apple, and the NSA. We need to prioritise the public good, over our distaste for one man's bigotry. We should condemn Mozilla's decision, but acknowledge that they're still our closest ally in the fight for a Free and Open Web.

Brendan Eich should be mocked publicly for being lost on the wrong side of history and Mozilla's board should question its decision to put at the helm of their company someone who has cost the company so much in the eyes of the public, but if we're going to continue fighting for an Open Web, we need to acknowledge who our friends are.

Hint: it's not Google.

June 15, 2009 19:07 +0000  |  Activism Drupal Free Software Linux PHP Software Technology Work [at] Play 0

I attended my first ever OpenWeb conference yesterday and, as per company policy, I have to report on and share what I learnt, so what better way to do so than to make a blog post for all to read?

General

OpenWeb is awesome. It's a conference where people from all over the world come to talk about Open design and communication and hopefully, learn to build a better web in the process. Attendees include programmers, entrepreneurs, designers, activists and politicians all with shared goals and differing skillsets. I shook hands with Evan Prodromou, the founder of identi.ca and WikiTravel, heard talks from the guys who write Firefox and Thunderbird as well as the newly-elected representative for the Pirate Party in the European Parliament, Rickard Falkvinge. All kinds of awesome I tell you.

Rickard Falkvinge: Keynote - On the Pirate Party

Founder of the Pirate Party in Sweden and now a representative in the European Parliament (thanks to proportional representation), Falkvinge was a passionate and eloquent speaker who covered the history of copyright, the present fight for greater control of so-called intellectual property and more importantly the far-reaching and very misunderstood effects of some of the legislation being passed to "protect" copyright holders while eliminating privacy rights for the public.

The talk was very in depth and difficult to cover in a single post so I encourage you to ask me about it in person some time. For the impatient though, I'll try to summarise:

The copyright debate isn't about downloading music; that's just a byproduct of the evolution of technology. As the printing press gave the public greater access to information, so has the Internet managed to disperse that information further. The problem is that now that the changing landscape has rendered certain business models ineffective, these businesses are fighting to change our laws to preserve said models rather than change with the times. Ranging from the frustratingly shortsighted attempts to ban technologies that further file sharing (legal or otherwise) to the instant wiretapping of every Internet connection (and by extension phone call) of every free citizen without a warrant, many of these changes are very, very scary.

"All of this has happened before, and it will happen again" he said. Every time a technological advancement creates serious change for citizen empowerment in society, the dominant forces in that society mobilise to crush it. The Catholic church, gatekeepers of the lion's share of human knowledge at the time actively worked to ban the printing press. They succeeded (if you can believe it) in France in 1535. This time, it's the media companies and they're willing to do anything, including associating file sharing with child pornography and terrorism to do it. Falkvinge's Pirate party is becoming the beachhead in the fight for copyright reform. Now the party with the largest youth delegation (30%!) in Sweden, they are working to get the crucial 4% of the seats in Parliament they need to hold the balance of power and they need your help. He'd like you to send the party 5€ or 10€ per month and I'm already on board.

Angie Byron: Keynote - Women in Open Source

Those of you who know me know that I can get pretty hostile when it comes to treating women like a special class of people (whether the light is positive or negative), so I was somewhat skeptical about this one. Thankfully, I was happy to hear Byron cover a number of issues with the Free Software community, ranging from blatant sexism (CouchDB guys... seriously?) to basic barriers to entry for anyone new to a project. There were a lot of really helpful recommendations for people wanting to engage 100% of the community rather than just one half or the other.

Blake Mizerany: Sinatra

Sinatra is a Ruby framework that went in the opposite direction of things like my beloved Django or Ruby's Rails. Rather than hide the nuts and bolts of HTTP from the developer, Sinatra puts it right out there for you. Where traditional frameworks tend to muddle GET, POST, PUT, and DELETE into one input stream, this framework structures your whole program into blocks a lot like this:

  # A complete Sinatra app: the route and its response live in one block.
  require 'rubygems'
  require 'sinatra'

  # GET /hi responds with the string returned by the block.
  get '/hi' do
    "Hello World!"
  end

That little snippet up there handles the routing and display for a simple Hello World program. Sinatra's strength is that it's simple and elegant. It lets you get at the real power at the heart of HTTP, which is really handy, but from what I could tell in the presentation, there's not a lot available outside of that. Database management is done separately, there's no ORM layer, etc. It's very good at what it does, but not at everything, which (at least in my book) makes it awesome.

Ben Galbraith and Dion Almaer: Mozilla Labs

These are the guys who make the Cool New Stuff that comes out of Mozilla. You know those guys, they write a nifty web browser called "Firefox", I'm sure you've heard of them.

Mozilla Labs is where the smart nerds get together to build and experiment with toys that will (hopefully) eventually make it into a finished product. Sometimes that product is an add-on or plug-in, other times it's an entirely new project. It's all about how useful something is to the public. And as always, the code is Free. You may have even heard of Ubiquity, an extension to Firefox that promises to reshape how we use a web browser... they're working on that.

This time through, they were demoing Bespin, a code editor in your web browser. Imagine opening a web browser, going to a page and doing your development there: no need for a local environment, but without the usual disadvantages of aggravating lag or difficult, text-only interface. Now imagine that you can share that development space with someone else in real time and that you can be doing this from your mobile device on a beach somewhere. Yeah, it's that awesome.

We watched as they demoed the crazy power that is the <canvas /> tag by creating a simple text editor, in Javascript right there in front of us... with about 15 lines of code. Really, really impressive.

David Ascher: Open Messaging on the Open Internet

Ascher's talk on Open Messaging was something I was really interested in, since I've been actively searching for information on federated social networking for a while now. The presentation was divided into two parts: half covering the history of email and its slow deprecation in favour of a number of different technologies, as well as how people are using it in ways never intended for the architecture. Major problems with the protocol itself were touched on, as well as an explanation of how some of the alternatives out there are also flawed.

He then went on to talk about Mozilla Thunderbird 3 and the variety of cool stuff that's happening with it. "Your mail client knows a lot about you" he says "but until now, we haven't really done a lot with it". Some of the new features for Thunderbird 3 include conversation tracking (like you see in Gmail), helping you keep track of what kinds of email you spend the most time on, who you communicate with most etc. and even statistical charts about what time of day you use mail, what kind of mail you send and to whom how often. It's very neat stuff. Add to this the fact that they've completely rewritten the plug-in support, so new extensions to Thunderbird mean that your mail client will be as useful as you want it to be.

Evan Prodromou: Open Source Microblogging with Laconica

Up until this talk (and with the exception of Falkvinge's keynote), I'd been interested, but not excited about OpenWeb. Prodromou's coverage of Laconica changed all of that.

Founder of WikiTravel and one of the developers on MediaWiki (the software behind Wikipedia), Prodromou has built a federated microblogging platform called Laconica. Think Twitter, but with the ability for an individual to retain ownership of his/her posts and even handle distribution -- with little or no technical knowledge required. Here, I made you a diagram to explain:

Federated Laconica vs. Monolithic Twitter

Here's how it is: whereas Twitter is a single central source of information, controlled by a single entity (in this case, a corporation), Laconica distributes the load to any number of separate servers owned by different people that all know how to communicate. Where you might be on a server in Toronto, hosted by NetFirms, I could be using a Laconica service hosted by Dreamhost in Honolulu. My posts go to my server, yours go to yours, and when my Twitter client wants to fetch your posts, it talks to NetFirms and vice versa.

The advantages are clear:

  1. Infinite scalability: Twitter's monolithic model necessitates crazy amounts of funding, and they still don't have a profit model to account for those costs. Laconica on the other hand means that the load is distributed across potentially millions of hosts (much like the rest of the web).
  2. You control your identity, not a private corporation.

The future is where it gets really exciting though. By retaining ownership of your identity and data, you can start to attach a variety of other data types to the protocol. For the moment, Laconica only supports twitter-like messages, but they're already expanding into file-sharing as well. You'll be able to attach images, video and music files, upload them to your server and share them with whomever is following you. After that, I expect that they'll expand further to include Flickr-like photo streams, Facebook-like friendships and LiveJournal-like blog posts. These old, expensive monolithic systems are going away. In the future we'll have one identity, in one place, that we control that manages all of the data we want to share with others.

Really, really cool stuff.

I went home that night and signed up as a developer on Laconica. I've downloaded the source and will experiment with it this week before I take on anything on the "to do" list. I intend to focus on expanding the feature set to include stuff that will deprecate the monolithic models mentioned above... should be fun :-)

Drupal Oops

I closed out the evening with some socialising in the hallway and some ranting about how-very-awesome Laconica was to my coworker Ronn, who showed up late in the day. He wandered off in search of my other colleagues and I followed after finishing a recap with Karen Quinn Fung, a fellow transit fan and Free Software fan. Unfortunately though, I wasn't really paying attention to where Ronn was going; I just followed out of curiosity. It turns out that I had stumbled into a Drupal social where I was almost immediately asked: "so, how do you use Drupal and how much do you love it?" by the social organiser. James gave me a horrified "what the hell are you doing here" look and, searching for words, I said something to the effect of "Um, well, I was pretty much just dropping in here looking for my co-workers... oh here they are! -- I like Drupal because it makes it easy for people to make websites, but I don't really use it because it gets in my way. I prefer simple, elegant solutions, and working around something just to get it to work is too aggravating." Considering the company, my response was pretty well received. I backed out quietly at the earliest opportunity :-)

So that was OpenWeb, well half of it anyway. I only got a pass for the Thursday. I can't recommend it enough though. Really interesting talks and really interesting people all over the place. I'll have to make sure that I go again next year.

August 27, 2008 18:06 +0000  |  Free Software Geek Stuff Software Technology 0

I just watched this amazing video on the future of how we'll use the Internet. For the nerdy among you: remember how people are always saying stuff like "this will make it a web service that other people can access for whatever they like"? Well this is the end result:

Such a brilliantly simple observation. These guys are doing a great job.