Planet HantsLUG

August 02, 2024

Debian Bits

Bits from the DPL

Dear Debian community,

These are my bits from the DPL, written on my last day of another great DebConf.

DebConf attendance

At the beginning of July, there was some discussion with the bursary and content team about sponsoring attendees. The discussion continued at DebConf. I do not have much experience with these discussions. My summary is that while there is an honest attempt to be fair to everyone, it did not seem to work for all, and some critical points for future discussion remained. In any case, I'm thankful to the bursary team for doing such a time-draining and tedious job.

Popular packages not yet on Salsa at all

Otto Kekäläinen did some interesting investigation about Popular packages not yet on Salsa at all. I might soon provide a more up-to-date list via a UDD query that considers more recent uploads than the trends data. For instance, wget has since been moved to Salsa (thanks to Noël Köthe for this).

Keep on contacting more teams

I kept on contacting teams in July. Although I managed to contact far fewer teams than I was hoping, I was able to present some conclusions in the Debian Teams exchange BoF and on slide 16/23 of my Bits from the DPL talk. I intend to contact further teams in the coming months.

Nominating Jeremy Bícha for GNOME Advisory Board

I've nominated Jeremy Bícha to the GNOME Advisory Board. Jeremy has volunteered to represent Debian at GUADEC in Denver.

DebCamp / DebConf

I attended DebCamp from the evening of 22 July and had a lot of fun with other attendees. As always, DebConf is an important event for me nearly every year. I enjoyed Korean food, a Korean bath, nature at the coastline, and other things.

I held a small event without video coverage, Creating web galleries including maps from a geo-tagged photo collection. At least two attendees of this workshop confirmed success in creating their own web galleries.

I used DebCamp and DebConf for several discussions. My main focus was on discussions with FTP master team members Luke Faraone, Sean Whitton, and Utkarsh Gupta. I'm really happy that the four of us absolutely agree on some proposed changes to the structure of the FTP master team, as well as changes that might be fruitful for the work of the FTP master team itself and for Debian developers regarding the processing of new packages.

My explicit thanks go to Luke Faraone, who gave a great introduction to FTP master work in their BoF. It was very instructive for the attending developers to understand how the FTP master team checks licenses and copyright and what workflow is used for accepting new packages.

In the first days of DebConf, I talked to representatives of DebConf platinum sponsor WindRiver, who announced the derivative eLxr. I warmly welcome this new derivative and look forward to some great cooperation. I also talked to the representative of our gold sponsor, Microsoft.

My first own event was the Debian Med BoF. I'd like to repeat that it might not only be interesting for people working in medicine and microbiology, as it always contains some hints on how to work together in a team.

As mentioned above, I tried to summarise some first results of my team contacts and got some further input from other teams in the Debian Teams exchange BoF.

Finally, I had my Bits from DPL talk. I received positive responses from attendees as well as from remote participants, which makes me quite happy. For those who were not able to join the events on-site or remotely, the videos of all events will be available on the DebConf site soon. I'd like to repeat the explicit need for some volunteers to join the Lintian team. I'd also like to point out the "Tiny tasks" initiative I'd like to start (see below).

By the way, if you'd like to solve my quiz about the background images, there is a summary page in my slides which might help to assign every slide to some DebConf. I assume that if you pool your knowledge you can solve more than just the simple ones. Just let me know if you have a solution. You can add numbers to the rows and letters to the columns and send me:

 2000/2001:  Uv + Wx
 2002: not attended
 2003: Yz
 2004: not attended
 2005:
 2006: not attended
 2007:
 ...
 2024: A1

This list provides some additional information for DebConfs I did not attend and when no video stream was available. It also reminds you about the one I uncovered this year and that I used two images from 2001 since I did not have one from 2000. Have fun reassembling good memories.

Tiny tasks: Bug of the day

As I mentioned in my Bits from DPL talk, I'd like to start a "Tiny tasks" effort within Debian. The first type of tasks will be the Bug of the day initiative. For those who would like to join, please join the corresponding Matrix channel. I'm curious to see how this might work out and am eager to gain some initial experiences with newcomers. I won't be available until next Monday, as I'll start traveling soon and have a family event (which is why I need to leave DebConf today after the formal dinner).

Kind regards from DebConf in Busan, Andreas.

by Andreas Tille at August 02, 2024 05:00 PM

July 28, 2024

Alan Pope

Application Screenshots on macOS

I initially started typing this as a short -[ Contrafibularities ]- segment for my free, weekly newsletter. But it got a bit long, so I broke it out into a blog post instead.

About that newsletter

The newsletter is emailed every Friday - subscribe here, and is archived and available via RSS a few days later. I talked a bit about the process of setting up the newsletter on episode 34 of Linux Matters Podcast. Have a listen if you’re interested.

Linux Matters 34

Patreon supporters of Linux Matters can get the show a day or so early, and without adverts.

Going live!

I have a work-supplied M3 MacBook Pro. It’s a lovely device with ludicrous battery endurance, great screen, keyboard and decent connectivity. As an ex-Windows user at work, and predominantly Linux enthusiast at home, macOS throws curveballs at me on a weekly basis. This week, screenshots.

I scripted a ‘going live’ shell script for my personal live streams. For the title card, I wanted the script to take a screenshot of the running terminal, Alacritty. I went looking for ways to do this on the command line, and learned that macOS has shipped a screencapture command-line tool for some time now. Amusingly the man page for it says:

DESCRIPTION
 The screencapture utility is not very well documented to date.
 A list of options follows.

and..

BUGS
 Better documentation is needed for this utility.

This is 100% correct.

How hard can it be?

Perhaps I’m too used to scrot on X11, that I have used for over 20 years. If I want a screenshot of the current running system, just run scrot and bang there’s a PNG in the current directory showing what’s on screen. Easy peasy.

On macOS, run screencapture image.png and you’ll get a screenshot alright, of the desktop, your wallpaper. Not the application windows carefully arranged on top. To me, this is somewhat obtuse. However, it is also possible to screenshot a window, if you know the <windowid>.

From the screencapture man page:

 -l <windowid> Captures the window with windowid.

There appears to be no straightforward way to actually get the <windowid> on macOS, though. So, to discover the <windowid> you might want the GetWindowID utility from smokris (easily installed using Homebrew).

That’s fine and relatively straightforward if there’s only one Window of the application, but a tiny bit more complex if the app reports multiple windows - even when there’s only one. Alacritty announces multiple windows, for some reason.

$ GetWindowID Alacritty --list
"" size=500x500 id=73843
"(null)" size=0x0 id=73842
"alan@Alans-MacBook-Pro.local (192.168.1.170) - byobu" size=1728x1080 id=73841

FINE. We can deal with that:

$ GetWindowID Alacritty --list | grep byobu | awk -F '=' '{print $3}'
73841

You may then encounter the mysterious could not create image from window error. This threw me off a little, initially. Thankfully I’m far from the first to encounter this.

Big thanks to this rancher-sandbox, rancher-desktop pull request against their screenshot docs. Through that I discovered there’s a macOS security permission I had to enable, for the terminal application to be able to initiate screenshots of itself.

A big thank-you to both of the above projects for making their knowledge available. Now I have this monstrosity in my script, to take a screenshot of the running Alacritty window:

screencapture -l$(GetWindowID Alacritty --list | \
 grep byobu | \
 awk -F '=' '{print $3}') titlecard.png

If you watch any of my live streams, you may notice the title card. Now you know how it’s made, or at least how the screenshot is created, anyway.

July 28, 2024 10:28 AM

July 27, 2024

Debian Bits

DebConf24 starts today in Busan on Sunday, July 28, 2024

DebConf24, the 25th annual Debian Developer Conference, is taking place in Busan, Republic of Korea from July 28th to August 4th, 2024. Debian contributors from all over the world have come together at Pukyong National University, Busan, to participate and work in a conference run exclusively by volunteers.

Today the main conference starts with around 340 expected attendees and over 100 scheduled activities, including 45-minute and 20-minute talks, Birds of a Feather ("BoF") team meetings, workshops, a job fair, as well as a variety of other events. The full schedule is updated each day, including activities planned ad-hoc by attendees over the course of the conference.

If you would like to engage remotely, you can follow the video streams available from the DebConf24 website for the events happening in the three talk rooms: Bada, Somin and Pado. Or you can join the conversations happening inside the talk rooms via the OFTC IRC network in the #debconf-bada, #debconf-somin, and #debconf-pado channels. Please also join us in the #debconf channel for common discussions related to DebConf.

You can also follow the live coverage of news about DebConf24 provided by our micronews service or the @debian profile on your favorite social network.

DebConf is committed to a safe and welcoming environment for all participants. Please see our Code of Conduct page for more information on this.

Debian thanks the commitment of numerous sponsors to support DebConf24, particularly our Platinum Sponsors: Proxmox, Infomaniak and Wind River.

DebConf24 logo

by The Debian Publicity Team at July 27, 2024 09:50 PM

DebConf24 welcomes its sponsors!

DebConf24 logo

DebConf24, the 25th edition of the Debian conference, is taking place at Pukyong National University in Busan, Republic of Korea. Thanks to the hard work of its organizers, it will again be an interesting and fruitful event for attendees.

We would like to warmly welcome the sponsors of DebConf24, and introduce them to you.

We have three Platinum sponsors.

  • Proxmox is the first Platinum sponsor. Proxmox provides powerful and user-friendly Open Source server software. Enterprises of all sizes and industries use Proxmox solutions to deploy efficient and simplified IT infrastructures, minimize total cost of ownership, and avoid vendor lock-in. Proxmox also offers commercial support, training services, and an extensive partner ecosystem to ensure business continuity for its customers. Proxmox Server Solutions GmbH was established in 2005 and is headquartered in Vienna, Austria. Proxmox builds its product offerings on top of the Debian operating system.

  • Our second Platinum sponsor is Infomaniak. Infomaniak is an independent cloud service provider recognised throughout Europe for its commitment to privacy, the local economy and the environment. Recording growth of 18% in 2023, the company is developing a suite of online collaborative tools and cloud hosting, streaming, marketing and events solutions. Infomaniak uses exclusively renewable energy, builds its own data centers and develops its solutions in Switzerland, without relocating. The company powers the website of the Belgian radio and TV service (RTBF) and provides streaming for more than 3,000 TV and radio stations in Europe.

  • Wind River is our third Platinum sponsor. For nearly 20 years, Wind River has led in commercial Open Source Linux solutions for mission-critical enterprise edge computing. With expertise across aerospace, automotive, industrial, telecom, and more, the company is committed to Open Source through initiatives like eLxr, Yocto, Zephyr, and StarlingX.

Our Gold sponsors are:

  • Ubuntu, the Operating System delivered by Canonical.

  • Freexian, a services company specialized in Free Software and in particular Debian GNU/Linux, covering consulting, custom developments, support and training. Freexian has a recognized Debian expertise thanks to the participation of Debian developers.

  • Lenovo, a global technology leader manufacturing a wide portfolio of connected products including smartphones, tablets, PCs and workstations as well as AR/VR devices, smart home/office and data center solutions.

  • Korea Tourism Organization, whose purpose is to advance tourism as a key driver for national economic growth and enhancement of national welfare; it intends to be a public organization that makes the Korean people happier and promotes national wealth through tourism.

  • Busan IT Industry Promotion Agency, an industry promotion organization that contributes to the innovation of the digital economy with the power of IT and CT and supports the ecosystem for innovative local startups and companies to grow.

  • Microsoft, who enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

  • doubleO, a company that specializes in consulting and developing empirical services using big data analysis and artificial intelligence. doubleO provides a variety of data-centered services together with small and medium-sized businesses in Busan/Gyeongnam.

Our Silver sponsors are:

  • Roche, a major international pharmaceutical provider and research company dedicated to personalized healthcare.
  • Two Sigma, which applies rigorous inquiry, technology, data science, and invention to bring science to finance and help solve the toughest challenges across financial services.
  • Arm: a leading technology provider of processor IP; Arm-powered solutions have been supporting innovation for more than 30 years and are deployed in over 280 billion chips to date.
  • The Bern University of Applied Sciences with around 7,800 students enrolled, located in the Swiss capital.
  • Google, one of the largest technology companies in the world, providing a wide range of Internet-related services and products such as online advertising technologies, search, cloud computing, software, and hardware.
  • FSIJ, the Free Software Initiative of Japan, a non-profit organization dedicated to supporting Free Software growth and development.
  • Busan Tourism Organisation: a leading public corporation that generates social and economic value in Busan's tourism industry, developing tourism resources in accordance with government policies and invigorating the tourism industry.
  • Civil Infrastructure Platform, a collaborative project hosted by the Linux Foundation, establishing an open source “base layer” of industrial grade software.
  • Collabora, a global consultancy delivering Open Source software solutions to the commercial world.
  • Matanel Foundation, which operates in Israel, as its first concern is to preserve the cohesion of a society and a nation plagued by divisions.

Bronze sponsors:

And finally, our Supporter level sponsors:

A special thanks to the Pukyong National University, our Venue Partner and our Network Partners KOREN and KREONET!

Thanks to all our sponsors for their support! Their contributions make it possible for a large number of Debian contributors from all over the globe to work together, help and learn from each other in DebConf24.

by The Debian Publicity Team at July 27, 2024 09:45 PM

July 25, 2024

Andy Smith

Daniel Kitson – Collaborator (work in progress)

Collaborators

Last night we went to see Daniel Kitson's "Collaborator" (work in progress). I'd no idea what to expect but it was really good!

A photo of the central area of a small theatre in the round. There are four tiers of seating and then an upper balcony. Most seats are filled. The central stage area is empty except for four large stacks of paper.
The in-the-round setup of Collaborator at The Albany Theatre, Deptford, London

It has been reviewed at 4/5 stars in Chortle and positively in the Guardian, but I don't recommend reading any reviews because they'll spoil what you will experience. We went in to it blind as I always prefer that rather than thorough research of a show. I think that was the correct decision. I've been on Daniel's fan newsletter for ages but hadn't had chance to see him live until now.

While I've seen some comedy gigs that resembled this, I've never seen anything quite like it.

At £12 a ticket this is an absolute bargain. We spent more getting there by public transport!

Shout out to the nerds

If you're a casual comedy enjoyer looking for something a bit different then that's all you need to know. If like me however you consider yourself a bit of a wanky appreciator of comedy as an art form, I have some additional thoughts!

Collaborator wasn't rolling-on-the-floor-in-tears funny, but was extremely enjoyable and Jenny and I spent the whole way home debating how Kitson designed it and what parts of it really meant. Not everyone wants that in comedy, and that's fine. I don't always want it either. But to get it sometimes is a rare treat.

It's okay to enjoy a McIntyre or Peter Kay crowd-pleaser about "do you have a kitchen drawer full of junk?" or "do you remember white dog poo?" but it's also okay to appreciate something that's very meta and deconstructive. Stewart Lee for example is often accused of being smug and arrogant when he critiques the work of other comedians, and his fans to some extent are also accused of enjoying feeling superior more than they enjoy a laugh - and some of them who miss the point definitely are like this.

But acts like Kitson and Lee are constructed personalities where what they claim to think and how they behave is a fundamental part of the performance. You are to some extent supposed to disagree with and be challenged by their views and behaviours — and I don't just mean they are edgelording with some "saying the things that can't be said" schtick. Sometimes it's fun to actually have thoughts about it. It's a different but no less valid (or more valid!) experience. A welcome one in this case!

I mean, I might have to judge you if you enjoy Mrs Brown's Boys, but I accept it has an audience as an art form.

White space

There was a comment on Fedi about how the crowd pictured here appears to be a sea of white faces, despite London being a fairly diverse city. This sort of thing hasn't escaped me. I've found it to be the case in most of the comedy gigs I've attended in person, where the performer is white. I don't know why. In fact, acts like Stewart Lee and Richard Herring will frequently make reference to the fact that their stereotypical audience member is a middle aged white male computer toucher with lefty London sensibilities. So, me then.

Don't get me wrong, I do try to see some diverse acts and have been in a demographic minority a few times. Sadly enough, just going to see a female act can be enough to put you in an audience of mostly women. That happened when we went to see Bridget Christie's Who Am I? ("a menopause laugh a minute with a confused, furious, sweaty woman who is annoyed by everything", 4 stars, Chortle), and it's a shame that people seem to stick in their lanes so much.

References

by Andy at July 25, 2024 12:00 AM

July 09, 2024

Steve Kemp

The CP/M emulator is good enough, I think.

My previous post mentioned that I'd added some custom syscalls to my CP/M emulator, and that led to some more updates: embedding a bunch of binaries within the emulator so that the settings can be tweaked at run-time, for example running:

!DEBUG 1
!CTRLC 1
!CCP ccpz
!CONSOLE adm-3a

Those embedded binaries show up on A: even if they're not in the pwd when you launch the emulator.

Other than the custom syscalls I've updated the real BDOS/BIOS syscalls a bit more, so that now I can run things like the Small C compiler, BBC BASIC, and more. (BBCBasic.com used to launch just fine, but it turned out that the SAVE/LOAD functions didn't work. Ooops!)

I think I've now reached a point where all the binaries I care about run, and barring issues I will slow down/stop development. I can run Turbo Pascal, WordStar, various BASIC interpreters, and I have a significantly improved understanding of how CP/M works - a key milestone in that understanding was getting SUBMIT.COM to execute, and understanding the split between the BDOS and the BIOS.

I'd kinda like to port CP/M to a new (Z80-based) system - but I don't have such a thing to hand, and I guess there's no real need for it. Perhaps I can say I'm "done" with retro stuff, and go back to playing Super Mario Bros (1985) with my boy!

July 09, 2024 08:00 PM


June 30, 2024

Andy Smith

Including remote data in a MediaWiki article

A few months ago I needed to include some data — that was generated and held remotely — into a MediaWiki article.

Here's the solution I chose which enabled me to generate some tables populated with data that only exists in some remote YAML files:

Screenshot of a wiki article that describes three different contact methods: a mailing list, an IRC chat room and a Telegram group. Beside each method is a table of their activity. The mailing list shows 10 messages in the last 30 days. The IRC channel shows 25 messages in the last 30 days. The Telegram group shows 91 messages in the last 30 days.
Screenshot of the Community article showing tables of activity stats
INFO

I did actually do all this back in early April, but as I couldn't read my own blog site at the time I had to set up a new blog before I could write about it! 😀

Background

All the way back in March 2024 I'd decided that BitFolk probably should have some alternative chat venue to its IRC channel, which had been largely silent for quite some time. So, I'd opened a Telegram group and spruced up the Community article on BitFolk's wiki.

When writing about the new thing in the article I got to thinking how I feel when I see a project with a bunch of different contact methods listed.

I'm usually glad to see that a project has ways to contact them that I don't consider awful, but if all the ones that I consider non-awful are actually deserted, barren and disused, then I'd like to be able to decide whether I would actually want to hold my nose and go to a Discord or some other service I ordinarily would dislike.

So, it's not just that these things exist — easy to just list off — but I decided I would like to also include some information about how active these things are (or not).

The problem

BitFolk's wiki is a MediaWiki site, so including any sort of dynamic content that isn't already implemented in the software would require code changes or an extension.

The one solution that doesn't involve developing something or using an existing extension would be to put an HTML <iframe> in a template that's set to allow raw HTML. <iframe>s aren't normally allowed in general articles due to the havoc they could cause with a population of untrusted authors, but putting them in templates would be okay since the content they would include could be locked down that way.

The appearance of such a thing though is just not very nice without a lot of styling work. That's basically a web site inside a web site. I had the hunch that there would be existing extensions for including structured remote data. And there is!

External_Data extension

The extension I settled on is called External_Data.

Description

Allows for using and displaying values retrieved from various sources: external URLs and SOAP services, local wiki pages and local files (in CSV, JSON, XML and other formats), database tables, LDAP servers and local programs output.

Just what I was looking for!

While this extension can just include plain text, there are other, simpler extensions I could have used if that were all I wanted. You see, each of the sets of activity stats has to be generated by a program specific to each service; counting mailing list posts is not like counting IRC messages, and so on.

I wanted to write programs that would store this information in a structured format like YAML and then External_Data would be used to turn each of those remote YAML files into a table.

Example YAML data

I structured the output of my programs like this:

---
bitfolk:
  messages_last_30day: 91
  messages_last_6hour: 0
  messages_last_day: 0
stats_at: 2024-06-29 21:02:03
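
The collectors themselves are service-specific, but a minimal Python sketch of one (the `render_stats_yaml` helper name and the stubbed example counts are hypothetical; the YAML is hand-formatted to avoid any dependency) could emit that structure like so:

```python
from datetime import datetime, timezone

def render_stats_yaml(group, counts, now=None):
    """Render activity stats as YAML text matching the structure above."""
    now = now or datetime.now(timezone.utc)
    lines = ["---", f"{group}:"]
    for key in sorted(counts):
        lines.append(f"  {key}: {counts[key]}")
    lines.append(f"stats_at: {now:%Y-%m-%d %H:%M:%S}")
    return "\n".join(lines) + "\n"

# Counts gathered by some service-specific collector (stubbed here).
text = render_stats_yaml("bitfolk", {
    "messages_last_30day": 91,
    "messages_last_6hour": 0,
    "messages_last_day": 0,
})
# text would then be written to e.g. tg.yaml and served over HTTPS.
```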

Markup in the wiki article

In the wiki article that is formatted like this:

{{#get_web_data:url=https://ruminant.bitfolk.com/social-stats/tg.yaml
|format=yaml
|data=bftg_stats_at=stats_at,bftg_last_6hour=messages_last_6hour,bftg_last_day=messages_last_day,bftg_last_30day=messages_last_30day
}}

{| class="wikitable" style="float:right; width:25em; margin:1em"
|+ Usage stats as of {{#external_value:bftg_stats_at}} GMT
|-
!colspan="3" | Messages in the last…
|-
! 6 hours || 24 hours || 30 days
|- style="text-align:center"
| {{#external_value:bftg_last_6hour}}
| {{#external_value:bftg_last_day}}
| {{#external_value:bftg_last_30day}}
|}

How it works

  1. Data is requested from a remote URL (https://ruminant.bitfolk.com/social-stats/tg.yaml).
  2. It's parsed as YAML.
  3. Values from the YAML are stored in variables in the article, e.g. bftg_stats_at is set to the value of stats_at from the YAML.
  4. A table in wiki syntax is made and the data inserted in to it with directives like {{#external_value:bftg_stats_at}}.

This could obviously be made cleaner by putting all the wiki markup in a template and just calling that with the variables.

Wrinkle: MediaWiki's caching

MediaWiki caches quite aggressively, which makes a lot of sense: it's expensive for some PHP to request wiki markup out of a database and convert it into HTML every time when it almost certainly hasn't changed since the last time someone looked at it. But that frustrates what I'm trying to do here. The remote data does update and MediaWiki doesn't know about that!

In theory it looks like it is possible to adjust cache times per article (or even per remote URI) but I didn't have much success getting that to work. It is possible to force an article's cache to be purged with just a POST request though, so I solved the problem by having each of my activity summarising programs issue such a request when their job is done. This will do it:

curl -s -X POST 'https://tools.bitfolk.com/wiki/Community?action=purge'

They only run once an hour anyway, so it's not a big deal.
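
If the collector is a Python program anyway, the equivalent of that curl call can be built with only the standard library; a sketch (the `purge_request` helper is hypothetical, the URL mirrors the curl example above):

```python
from urllib.request import Request, urlopen

def purge_request(article_url):
    # MediaWiki purges an article's parser cache on a POST to ?action=purge.
    return Request(article_url + "?action=purge", method="POST")

req = purge_request("https://tools.bitfolk.com/wiki/Community")
# urlopen(req) would actually send it; omitted here to keep the sketch inert.
```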

Concerns?

Isn't it dangerous to allow article authors to include arbitrary remote data?

Yes! The main wiki configuration can have a section added which sets an allowlist of domains or even URI prefixes for what is allowed to be included.

What if the remote data becomes unavailable?

The extension has settings for how stale data can be before it's rejected. In this case it's a trivial use so it doesn't really matter, of course.

by Andy at June 30, 2024 12:00 AM

June 22, 2024

Andy Smith

Thoughts on commenting facilities for this site

This site, being a static one, presents some challenges with regard to accepting comments from its readers. There's also a bunch of comments that already exist on the legacy site. I have some thoughts about what I should do about this.

TL;DR

I think I will:

  • Try to set up Isso at least for the purpose of importing old comments.
  • Then I'll see what Isso is like generally.
  • If Isso didn't work for importing then I'll try the XML conversion myself.
  • Independently, I'll investigate the Fediverse conversation thing.

Updates

2024-06-24

Isso was implemented and comments from the old Wordpress blog were imported into it. I'm still unsure if I will continue to allow new comments through it though.

The problem

It's a bit tricky to accept comments onto a web site that is running off of static files. JavaScript is basically the only way to do it, for which there are a number of options.

There's a couple of hundred comments on the 300 or so articles that exist on the legacy Wordpress site as well, and at least some of them I think are worth moving over when I get around to moving over an article from there.

Is it really worth having comments?

I mean, the benefits are slim, but they definitely do exist. I've had a few really useful and interesting comments over the years and it would be a shame to do away with that feature even if it does make life a lot easier.

So, conclusions:

  • I should find a way to bring over some if not all of the comments that already exist.
  • I should provide at least one way1 to let people comment on new articles.

Other people's value judgements can and will differ. A lot of this is just going to be, like, my opinion, man.

In that case, what to do about…

Existing comments

I've got an XML export of the legacy blog which includes all the comment data along with the post data. The Wordpress-to-Markdown conversion program that I've used only converts the post body, though, so at the moment none of the articles I've migrated have had their comments brought along with them.

I think it will be enough to also add the existing comment data as static HTML. I don't think there's any real need to make it possible for people to interact with past comments. There's some personal information that commenters may have provided, like what they want to be known as and their web site if any. There has to be a means for that to be deleted upon request, but I think it will be okay to expect such requests to come in by email.

After a casual search I haven't managed to find existing software that will convert the comments in a Wordpress XML export into Markdown or static HTML. I might have missed something though because the search results are filled with a plethora of Wordpress plugins for static site export. One of those might actually be suitable. If you happen to know of something that may be suitable please let me know! I guess that would have to be by email or Fediverse right now (links at the bottom of the page).

It is claimed, however, that Isso (see below) can import comments from a Wordpress XML export!

The comments XML

Comments in the XML export look like this (omitting some uninteresting fields):

<item>
  <wp:post_id>16</wp:post_id>
  <!-- more stuff about the article in here -->
  <wp:comment>
    <wp:comment_id>119645</wp:comment_id>
    <wp:comment_parent>0</wp:comment_parent>
    <wp:comment_author><![CDATA[Greyhound lover]]></wp:comment_author>
    <wp:comment_date><![CDATA[2009-07-08 10:35:14]]></wp:comment_date>
    <wp:comment_content><![CDATA[What a nicely reasoned and well informed article.

The message about Greyhound rescue is getting through, but far too slowly.

I hope your post gets a lot of traffic.

Ray]]></wp:comment_content>
  </wp:comment>
</item>

If I can't find existing software to do it, I think my skills do stretch to using XSLT or something to transform the list of <wp:comment></wp:comment>s into Markdown or HTML for simple inclusion.
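For instance, Python's standard library XML support would do the job without needing XSLT. A minimal sketch, assuming the WXR 1.2 namespace (older exports use /1.0/ or /1.1/) and with an illustrative output format:

```python
# Sketch: extract comments from a Wordpress WXR export into Markdown.
# Assumes the WXR 1.2 namespace; adjust WP for older export versions.
import xml.etree.ElementTree as ET

WP = "{http://wordpress.org/export/1.2/}"

def comments_to_markdown(xml_path):
    tree = ET.parse(xml_path)
    chunks = []
    for item in tree.getroot().iter("item"):
        post_id = item.findtext(f"{WP}post_id")
        for c in item.iter(f"{WP}comment"):
            author = c.findtext(f"{WP}comment_author", "")
            date = c.findtext(f"{WP}comment_date", "")
            body = c.findtext(f"{WP}comment_content", "")
            chunks.append(f"**{author}** ({date}, post {post_id}):\n\n{body}\n")
    return "\n---\n\n".join(chunks)
```

ElementTree handles the CDATA sections transparently, so the comment bodies come out as plain text.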

Wordpress does comment threading by having the <wp:comment_parent> be non-zero. I think that would be nice to replicate but if my skills end up not being up to it then it will be okay to just have a flat chronological list. I'll keep the data to leave the door open to improving it in future.
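Rebuilding the threading is a small tree-building exercise once the comments are extracted. A sketch, assuming each comment has been pulled into a dict with "id" and "parent" keys (names here are mine, not Wordpress's):

```python
# Sketch: nest comments using <wp:comment_parent> (0 = top level).
# `comments` is a list of dicts with "id" and "parent" keys.
def build_thread(comments):
    by_id = {c["id"]: {**c, "replies": []} for c in comments}
    roots = []
    for c in by_id.values():
        parent = by_id.get(c["parent"])
        # Parent id 0 (or a missing parent) means a top-level comment.
        (parent["replies"] if parent else roots).append(c)
    return roots
```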

I haven't decided yet if it will be more important to bring over old comments, or to figure out a solution for new comments.

Future comments

All of the options for adding comments to a static site involve JavaScript. Whatever I choose to do, people who want to keep JS disabled are not going to be able to add comments and will just have to make do with email.

I'm aware of a few different options in this space.

Disqus

Just no. Surveillance capitalism.

giscus

giscus stores comments in GitHub Discussions. It's got a nice user interface, since the UI of GitHub itself is pretty fancy, but it does mean that every commenter needs a GitHub account and anonymous comments aren't possible.

There's also utterances, which stores things in GitHub issues but has fewer features than giscus and the same major downsides.

I am Not A Fan of requiring people to use GitHub.

Hyvor Talk

Hyvor Talk is a closed source paid service that's a bit fancier than giscus.

I'm still not particularly a fan of making people log in to some third party service.

Isso

Isso is a self-hosted open source service that's got quite a nice user interface, permits things like Markdown in comments, and optionally allows anonymous comments so that commenters don't need to maintain an account if they don't want to.

I think this one is a real contender!

Mastodon API

This isn't quite a commenting system, since it doesn't involve directly posting comments.

The idea is that each article has an associated toot ID which is the identifier for a post on a Mastodon server. The Mastodon API is then used to display all Fediverse replies to that post. So:

  1. You post on your Mastodon server about the article.
  2. You take the toot ID of that post and set it in a variable in the article's front matter.
  3. JavaScript on your site is then able to display all the comments on that Fediverse post.
NOTE

In this section I talk about "Fediverse" and "Mastodon".

I'm not an expert on this but my understanding is that Fediverse instances exchange data using the ActivityPub protocol, and Mastodon is a particular implementation of a Fediverse instance.

However, Mastodon's API is unique to itself (and derivative software), so this commenting system relies on the article author having an account on a Mastodon server. Everyone else replying on the Fediverse need not be using Mastodon on their own instances, though; their replies will still show up.

The effect is that a Fediverse conversation about your article is placed on your article. This project is an example of such a thing.
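Step 3 boils down to one unauthenticated call to Mastodon's public context endpoint for the toot. A minimal sketch of the idea in Python (on the actual site it would be client-side JavaScript; the endpoint and field names are from the Mastodon API, the helper names are mine):

```python
# Sketch: find Fediverse replies to a toot via Mastodon's public
# /api/v1/statuses/:id/context endpoint. No auth token is needed
# for replies to a public post.

def context_url(instance, toot_id):
    return f"https://{instance}/api/v1/statuses/{toot_id}/context"

def replies_from_context(context):
    # "descendants" holds every reply in the thread; each entry
    # carries the author's handle and the (HTML) reply body.
    return [(r["account"]["acct"], r["content"]) for r in context["descendants"]]
```

Fetch `context_url(...)`, feed the decoded JSON to `replies_from_context`, and render the result under the article.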

Of course, not everyone has a Fediverse account and not everyone wants one, but at least anyone potentially can, without having to deal with some central third party. And if no existing Fediverse instance suits them then they can set up their own. It's a decentralised solution.

The extremely niche nature of the Fediverse is pretty stark:

Active users:

  • Fediverse: ~1 million as of June 2024.
  • GitHub: ~100 million as of January 2023 (unclear how "active" is defined).

Fediverse comments are also basically just plain text and links; unfortunately there's no way to express yourself better with Markdown or other styling.

Nevertheless, I like it. I think I want to pursue this. Maybe in combination with Isso, if that doesn't get too noisy.


1. "Email" doesn't count!

by Andy at June 22, 2024 12:00 AM

May 25, 2024

Steve Kemp

The CP/M emulator is working well

In my recent posts I've talked about implementing BDOS and BIOS syscalls for my CP/M emulator. I've now implemented enough of the calls that I can run many of the standard binaries:

  • The Aztec C compiler
  • Microsoft BASIC
  • Turbo Pascal
  • Wordstar
  • etc

Of course I've not implemented all the syscalls, so the emulation isn't 100% perfect and many binaries won't run. But I sent myself on a detour by implementing extra syscalls, custom syscalls.

Traditionally CP/M systems are "rebooted" by pressing Ctrl-C at the CCP prompt. I thought that was something I'd press by accident, so I implemented the restart behaviour only when the user presses Ctrl-C twice in a row. But then I added a custom syscall that lets you change the value:

A>ctrlc
The Ctrl-C count is currently set to 2
A>ctrlc 1
The Ctrl-C count is currently set to 1
A>

So you can now change the value at runtime. Similarly there is support for switching CCP at runtime, and even changing the default output-device from ADM-3A to ANSI, or vice-versa. It's kinda neat to make these kinds of extensions, and happily the traditional BIOS has two syscalls reserved for custom use, so I just used one of those.

I've added support for testing whether a binary is running under my emulator, or not, using a custom syscall. So I can run:

A>test
This binary is running under cpmulator:

cpmulator unreleased
https://github.com/skx/cpmulator/

On another emulator I see this:

A>test
Illegal BIOS call 31
No, this binary is not running under cpmulator.

Anyway I'm happy with the current state of things, and I fixed a couple of bugs, which means I now have support for SUBMIT.COM, which is a real time-saver.

May 25, 2024 12:00 PM

April 27, 2024

Alan Pope

The Joy of Code

A few weeks ago, in episode 25 of Linux Matters Podcast I brought up the subject of ‘Coding Joy’. This blog post is an expanded follow-up to that segment. Go and listen to that episode - or not - it’s all covered here.

The Joy of Linux Torture

Not a Developer

I’ve said this many times - I’ve never considered myself a ‘Developer’. It’s not so much imposter syndrome as plain fact: I didn’t attend university to study software engineering, and have never held a job with ‘Engineer’ or ‘Developer’ in the title.

(I do have Engineering Manager and Developer Advocate roles in my past, but in popey’s weird set of rules, those don’t count.)

I have written code over the years. Starting with BASIC on the Sinclair ZX81 and Sinclair Spectrum, I wrote stuff for fun and no financial gain. I also coded in Z80 & 6502 assembler, taught myself Pascal on my Epson 8086 PC in 1990, then QuickBasic and years later, BlitzBasic, Lua (via LÖVE) and more.

In the workplace, I wrote some alarmingly complex utilities in Windows batch scripts and later Bash shell scripts on Linux. In a past career, I would write ABAP in SAP - which turned into an internal product mildly amusingly called “Alan’s Tool”.

These were pretty much all coding for fun, though. Nobody specced up a project and assigned me as a developer on it. I just picked up the tools and started making something, whether that was a sprite routine in Z80 assembler, an educational CPU simulator in Pascal, or a spreadsheet uploader for SAP BiW.

In 2003, three years before Twitter launched in 2006, I made a service called ‘Clunky.net’. It was a bunch of PHP and Perl smashed together and published online with little regard for longevity or security. Users could sign up and send ’tweet’ style messages from their phone via SMS, which would be presented in a reverse-chronological timeline. It didn’t last, but I had fun making it while it did.

They were all fun side-quests.

None of this makes me a developer.

Volatile Memories

It’s rapidly approaching fifty years since I first wrote any code on my first computer. Back then, you’d typically write code and then either save it on tape (if you were patient) or disk (if you were loaded). Maybe you’d write it down - either before or after you typed it in - or perhaps you’d turn the computer off and lose it all.

When I studied for a BTEC National Diploma in Computer Studies at college, one of our classes was on the IBM PC with two floppy disc drives. The lecturer kept hold of all the floppies because we couldn’t be trusted not to lose, damage or forget them. Sometimes the lecturer was held up at the start of class, so we’d be sat twiddling our thumbs for a bit.

In those days, when you booted the PC with no floppy inserted, it would go directly into BASICA, like the 8-bit microcomputers before it. I would frequently start writing something, anything, to pass the time.

With no floppy disks on hand, the code - beautiful as it was - would be lost. The lecturer often reset the room when they entered, hitting a big red ‘Stop’ button, which instantly powered down all the computers, losing whatever ‘work’ you’d done.

I was probably a little irritated in the moment, just as I would be when the RAM pack wobbled on my ZX81, losing everything. You move on, though, and make something else, or get on with your college work, and soon forget about it.

Or you bitterly remember it and write a blog post four decades later. Each to their own.

Sharing is Caring

This part was the main focus of the conversation when we talked about this on the show.

In the modern age, over the last ten to fifteen years or so, I’ve not done so much of the kind of coding I wrote about above. I certainly have done some stuff for work, mostly around packaging other people’s software as snaps or writing noddy little shell scripts. But I lost a lot of the ‘joy’ of coding recently.

Why?

I think a big part is the expectation that I’d make the code available to others. The public scrutiny others give your code may have been a factor. The pressure I felt that I should put my code out and continue to maintain it rather than throw it over the wall wouldn’t have helped.

I think I was so obsessed with doing the ‘right’ thing that coding ‘correctly’ or following standards and making it all maintainable became a cognitive roadblock.

I would start writing something and then begin wondering, ‘How would someone package this up?’ and ‘Am I using modern coding standards, toolkits, and frameworks?’ This held me back from the joy of coding in the first place. I was obsessing too much over other people’s opinions of my code and whether someone else could build and run it.

I never used to care about this stuff for personal projects, and it was a lot more joyful an experience - for me.

I used to have an idea, pick up a text editor and start coding. I missed that.

Realisation

In January this year, Terence Eden wrote about his escapades making a FourSquare-like service using ActivityPub and OpenStreetMap. When he first mentioned this on Mastodon, I grabbed a copy of the code he shared and had a brief look at it.

The code was surprisingly simple, scrappy, kinda working, and written in PHP. I was immediately thrown back twenty years to my terrible ‘Clunky’ code and how much fun it was to throw together.

In February, I bumped into Terence at State of Open Con in London and took the opportunity to quiz him about his creation. We discussed his choice of technology (PHP), and the simple ’thrown together in a day’ nature of the project.

At that point, I had a bit of a light-bulb moment, realising that I could get back to joyful coding. I don’t have to share everything; not every project needs to be an Open-Source Opus.

I can open a text editor, type some code, and enjoy it, and that’s enough.

Joy Rediscovered

I had an idea for a web application and wanted to prototype something without too much technological research or overhead. So I created a folder on my home server, ran php -S 0.0.0.0:9000 in a terminal there, made a skeleton index.php and pointed a browser at the address. Boom! Application created!

I created some horribly insecure and probably unmaintainable PHP that will almost certainly never see the light of day.

I had fun doing it though. Which is really the whole point.

More side-quests, fewer grand plans.

April 27, 2024 08:00 AM

April 26, 2024

Alan Pope

Do you know Simone?

Over coffee this morning, I stumbled upon simone, a fledgling Open-Source tool for repurposing YouTube videos as blog posts. The Python tool creates a text summary of the video and extracts some contextual frames to illustrate the text.

A neat idea! In my experience, software engineers are often tasked with making demonstration videos, but other engineers commonly prefer consuming the written word over watching a video. I took simone for a spin, to see how well it works. Scroll down and tell me what you think!

I was sat in front of my work laptop, which is a mac, so roughly speaking, this is what I did:

  • Install host pre-requisites
$ brew install ffmpeg tesseract virtualenv
git clone https://github.com/rajtilakjee/simone
  • Get a free API key from OpenRouter
  • Put the API key in .env
GEMMA_API_KEY=sk-or-v1-0000000000000000000000000000000000000000000000000000000000000000
  • Install python requisites
$ cd simone
$ virtualenv .venv
$ source .venv/bin/activate
(.venv) $ pip install -r requirements.txt
  • Run it!
(.venv) $ python src/main.py
Enter YouTube URL: https://www.youtube.com/watch?v=VDIAHEoECfM
/Users/alan/Work/rajtilakjee/simone/.venv/lib/python3.12/site-packages/whisper/transcribe.py:115: UserWarning: FP16 is not supported on CPU; using FP32 instead
 warnings.warn("FP16 is not supported on CPU; using FP32 instead")
Traceback (most recent call last):
 File "/Users/alan/Work/rajtilakjee/simone/.venv/lib/python3.12/site-packages/pytesseract/pytesseract.py", line 255, in run_tesseract
 proc = subprocess.Popen(cmd_args, **subprocess_args())
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/opt/homebrew/Cellar/python@3.12/3.12.3/Frameworks/Python.framework/Versions/3.12/lib/python3.12/subprocess.py", line 1026, in __init__
 self._execute_child(args, executable, preexec_fn, close_fds,
 File "/opt/homebrew/Cellar/python@3.12/3.12.3/Frameworks/Python.framework/Versions/3.12/lib/python3.12/subprocess.py", line 1955, in _execute_child
 raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'C:/Program Files/Tesseract-OCR/tesseract.exe'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
 File "/Users/alan/Work/rajtilakjee/simone/src/main.py", line 47, in <module>
 blogpost(url)
 File "/Users/alan/Work/rajtilakjee/simone/src/main.py", line 39, in blogpost
 score = scores.score_frames()
 ^^^^^^^^^^^^^^^^^^^^^
 File "/Users/alan/Work/rajtilakjee/simone/src/utils/scorer.py", line 20, in score_frames
 extracted_text = pytesseract.image_to_string(
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/Users/alan/Work/rajtilakjee/simone/.venv/lib/python3.12/site-packages/pytesseract/pytesseract.py", line 423, in image_to_string
 return {
 ^
 File "/Users/alan/Work/rajtilakjee/simone/.venv/lib/python3.12/site-packages/pytesseract/pytesseract.py", line 426, in <lambda>
 Output.STRING: lambda: run_and_get_output(*args),
 ^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/Users/alan/Work/rajtilakjee/simone/.venv/lib/python3.12/site-packages/pytesseract/pytesseract.py", line 288, in run_and_get_output
 run_tesseract(**kwargs)
 File "/Users/alan/Work/rajtilakjee/simone/.venv/lib/python3.12/site-packages/pytesseract/pytesseract.py", line 260, in run_tesseract
 raise TesseractNotFoundError()
pytesseract.pytesseract.TesseractNotFoundError: C:/Program Files/Tesseract-OCR/tesseract.exe is not installed or it's not in your PATH. See README file for more information.
  • Oof!
  • File a bug (like a good Open Source citizen)
  • Locally patch the file and try again
(.venv) python src/main.py
Enter YouTube URL: https://www.youtube.com/watch?v=VDIAHEoECfM
/Users/alan/Work/rajtilakjee/simone/.venv/lib/python3.12/site-packages/whisper/transcribe.py:115: UserWarning: FP16 is not supported on CPU; using FP32 instead
 warnings.warn("FP16 is not supported on CPU; using FP32 instead")
  • Look for results
(.venv) $ ls -l generated_blogpost.txt *.jpg
-rw-r--r-- 1 alan staff 2163 26 Apr 09:26 generated_blogpost.txt
-rw-r--r--@ 1 alan staff 132984 26 Apr 09:27 top_frame_4_score_106.jpg
-rw-r--r-- 1 alan staff 184705 26 Apr 09:27 top_frame_5_score_105.jpg
-rw-r--r-- 1 alan staff 126148 26 Apr 09:27 top_frame_9_score_101.jpg

In my test I pointed simone at a short demo video from my employer, Anchore’s YouTube channel. The results are below, unedited; I’ve even kept the typos. The images at the bottom of this post are frames from the video that simone selected.


Ancors Static Stick Checker Tool Demo: Evaluating and Resolving Security Findings

Introduction

Static stick checker tool helps developers identify security vulnerabilities in Docker images by running open-source security checks and generating remediation recommendations. This blog post summarizes a live demo of the tool’s capabilities.

How it works

The tool works by:

  • Downloading and analyzing the Docker image.
  • Detecting the base operating system distribution and selecting the appropriate stick profile.
  • Running open-source security checks on the image.
  • Generating a report of identified vulnerabilities and remediation actions.

Demo Walkthrough

The demo showcases the following steps:

  • Image preparation: Uploading a Docker image to a registry.
  • Tool execution: Running the static stick checker tool against the image.
  • Results viewing: Analyzing the generated stick results and identifying vulnerabilities.
  • Remediation: Implementing suggested remediation actions by modifying the Dockerfile.
  • Re-checking: Running the tool again to verify that the fixes have been effective.

Key findings

  • The static stick checker tool identified vulnerabilities in the Docker image in areas such as:
    • Verifying file hash integrity.
    • Configuring cryptography policy.
    • Verifying file permissions.
  • Remediation scripts were provided to address each vulnerability.
  • By implementing the recommended changes, the security posture of the Docker image was improved.

Benefits of using the static stick checker tool

  • Identify security vulnerabilities early in the development process.
  • Automate the remediation process.
  • Shift security checks leftward in the development pipeline.
  • Reduce the burden on security teams by addressing vulnerabilities before deployment.

Conclusion

The Ancors static stick checker tool provides a valuable tool for developers to improve the security of their Docker images. By proactively addressing vulnerabilities during the development process, organizations can ensure their applications are secure and reduce the risk of security incidents


Here’s the images it pulled out:

First image taken from the video

Second image taken from the video

Third image taken from the video

Not bad! It could be better - getting the company name wrong, for one!

I can imagine using this to create a YouTube description, or use it as a skeleton from which a blog post could be created. I certainly wouldn’t just pipe the output of this into blog posts! But so many videos need better descriptions, and this could help!

April 26, 2024 09:00 AM

August 17, 2023

Martin Wimpress

Install ZeroTier on Steam Deck

How to persist software installation across SteamOS updates on the Steam Deck.

by Martin Wimpress (martin@wimpress.com) at August 17, 2023 11:15 AM

June 06, 2023

Martin A. Brooks

When contract hunting goes wrong: TEKsystems & Allegis Group

I was approached by a recruiter from TEKsystems who were looking for a Linux systems administration and automation type person for a project with one of their clients.  I took a look at the job description, and it seemed like a pretty good match for my skills, so I was happy to apply and for TEKsystems to represent me.

I was interviewed three times by members of the team I would be working in over the course of about two weeks.  The people were based in Sweden and Norway and, having previously lived in Norway, I felt brave enough to try out bits of my very very rusty Norwegian.  The interviews all seemed to go well and, a few days later, I was offered the role which I accepted.  A start date of May 15th 2023 was agreed.

I consider it a sincere and meaningful compliment when I am offered work, so it’s important to know that, in accepting this role, I had turned down three other opportunities, two permanent roles and one other contract.

As this role was deemed inside IR35, I would have to work through an umbrella company.  It’s usually less friction to just go with the agency’s recommended option which was to use their parent company, Allegis Group.  I duly went through their onboarding process, proving my address, identity, right to work and so on and so forth.  All pretty standard stuff.

As May 15th approached, I was conscious that I had not, as yet, received any onboarding instructions, either directly from the client or via the agency. Whom did I contact on the 15th, when and how?  As this was a remote work contract, I was also expecting delivery of a corporate laptop.  This had not yet turned up.

Late in the week before the 15th, I had a call from the agency saying that there had been some kind of incident that the team I would be working with had to deal with.  They had no-one available to do any kind of onboarding with me, so would I mind deferring the start of the contract by a week?

It turned out it was very convenient for me.  A friend of the family had died a few weeks earlier from breast cancer and the funeral was on the Friday beforehand and, as it happened, my wife and daughter also got stranded in France due to the strikes.  A couple of extra days free to deal with all of that were helpful, so I agreed and everyone was happy.

Towards the end of that week, there had still been radio silence from the client. The agency was trying to obtain a Scope Of Work from them which would lead to an actual contract being drawn up for signing.

The next Monday was a bank holiday and, on the Tuesday morning, I got this message from the agency.

Hello Martin

We would like to update you to confirm we are unable to continue with your onboarding journey, and as such your onboarding journey has now ceased.

We wish you all the best for your future assignments.

Many thanks,

OnboardingTeam@TEKsystems

Needless to say, this was rather surprising and resulted in me attempting to get in touch with someone there to discover what was going on.  No immediate answer was forthcoming other than vague mentions of difficulty with a Swedish business entity not being able to take on a UK-based resource.  I was told that efforts would be made to clarify the situation.  To the day of writing this, that’s still not happened.  Well, not for me at least.

At the end of that week, it became obvious that whatever problem had happened was terminal for my contract, so I started back contact hunting and reactivating my CV on the various job boards.

I asked TEKsystems if they would offer any kind of compensation.  I’d acted entirely in good faith: I’d turned down three other offers of work, told other agencies I was no longer available and deactivated my CV on the various job boards.  It seemed fair they should offer me some kind of compensation for the lost earnings, wasted time and lost opportunities.  They have declined this request leaving me entirely out of pocket for the 3 weeks I should have been working for them and, of course, unexpectedly out of work.

I’m obviously back looking for my next opportunity and I’m sure something will be along in due course.  This is a cautionary tale of what can go wrong in the world of contracting and, if your next contract involves TEKsystems or Allegis Group, you might wish to be extra careful, making sure they are actually able to offer you the work they say they are, and that you get paid.

by Martin A. Brooks at June 06, 2023 08:27 PM

May 01, 2023

Martin Wimpress

Steam Box vs Steam Deck

I declined my Steam Deck pre-order and I’m now playing more games on Linux

by Martin Wimpress (martin@wimpress.com) at May 01, 2023 05:38 PM

April 28, 2023

Martin Wimpress

July 10, 2020

Martin A. Brooks

Getting started with a UniFi Dream Machine Pro

It’s not an exaggeration to say that I’m an Ubiquiti fanboy. I like their kit a lot and my home network has been 100% UniFi for quite a few years now.

I’ve just moved in to a new home which I’m getting rewired and this will include putting structured network cabling in, terminating back to a patch panel in a rack in the loft. I have a small amount of “always on” kit and I wanted as much of it as reasonably possible to be in standard 19″ rack format. This is when I started looking at the Ubiquiti Dream Machine Pro to replace a combination of a UniFi CloudKey and Security Gateway, both excellent products in their own right.

My expectation was that I would connect the UDMP to some power, move the WAN RJ45 connection from the USG to the UDMP, fill in some credentials and (mostly) done! As I’m writing this down, you can probably guess it didn’t quite work out like that.

The UDMP completely failed to get an internet connection via all the supported methods applicable. PPPoE didn’t work, using a surrogate router via DHCP didn’t work, static configuration didn’t work. I reached out to the community forum and, in fairness, got very prompt assistance from a Ubiquiti employee.

I needed to upgrade the UDMP’s firmware before it would be able to run its “first setup” process, but updating the firmware via the GUI requires a working internet connection. It’s all a little bit chicken and egg. Instead, this is what you need to do:

  • Download the current UDMP firmware onto a laptop.
  • Reconfigure the laptop’s IP to be 192.168.1.2/24 and plug it in to any of the main 8 ethernet ports on the UDMP.
  • Use scp to copy the firmware to the UDMP using the default username of “root” with the password “ubnt”:
    scp /path/to/fw.bin root@192.168.1.1:/mnt/data/fw.bin
  • SSH in to the UDMP and install the new firmware:
    ubnt-upgrade /mnt/data/fw.bin

The UDMP should reboot onto the new firmware automatically. Perhaps because I’d been attempting so many variations of the setup procedure, after rebooting my UDMP was left in an errored state with messages like “This is taking a little longer..” and “UDM Pro is having an issue booting. Try to reboot or enter Recovery Mode”. To get round this I updated the firmware again, this time doing a factory reset:

ubnt-upgrade -c /mnt/data/fw.bin

The UDMP then rebooted again without error and I was able to complete the setup process normally.

It’s a bit unfortunate that UDMPs are shipping with essentially non-functional firmware, and it’s also unfortunate that the process for dealing with this is completely undocumented.

by Martin A. Brooks at July 10, 2020 06:07 PM

May 29, 2020

Martin A. Brooks

Letter from my MP regarding Dominic Cummings

I wrote to my MP, Julia Lopez (CON), asking for her view on whether Dominic Cummings had broken the law or not and if he should be removed from his position. Here is her response:

Thank you for your email about the Prime Minister’s adviser, Dominic Cummings, and his movements during the lockdown period. I apologise for taking a few days to get back to you, however I am in the last weeks of my maternity leave and am working through a number of tasks in preparation for my return.

I have read through all the emails sent to me about Mr Cummings and completely understand the anger some correspondents feel. It has been a very testing time for so many of us as we have strived to adhere to new restrictions that have separated us from loved ones, led us to make very difficult decisions about our living and working arrangements or seen us miss important family occasions – both happy and sad. Those sacrifices have often been painful but were made in good faith in order to protect ourselves, our families and the most vulnerable in the broader community.

Given the strength of feeling among constituents, I wrote to the Prime Minister this week to advise him of the number of emails I had received and the sentiments expressed within them, highlighting in particular the concern over public health messaging. Mr Cummings has sought to explain his actions in a press conference in Downing Street and has taken questions from journalists. While his explanation has satisfied some constituents, I know others believe it was inadequate and feel that this episode requires an independent inquiry. I have made that request to the Prime Minister on behalf of that group of constituents.

Mr Cummings asserts that he acted within lockdown rules which permitted travel in exceptional circumstances to find the right kind of childcare. In the time period in question, he advises that he was dealing with a sick wife, a child who required hospitalisation, a boss who was gravely ill, security concerns at his home, and the management of a deeply challenging public health crisis. It has been asserted that Mr Cummings believes he is subject to a different set of rules to everyone else, but he explained in this period that he did not seek privileged access to covid testing and did not go to the funeral of a very close family member.

I am not going to be among those MPs calling for Mr Cummings’ head to roll. Ultimately it is for the Prime Minister to decide whether he wishes Mr Cummings to remain in post – and to be accountable for and accept the consequences of the decision he makes – and for the relevant authorities to determine whether he has broken the law. Whatever one thinks of this episode, I think the hounding of Mr Cummings’ family has been disturbing to watch and I hope that in future the press can find a way of seeking truth without so aggressively intruding into the lives of those who have done nothing to justify their attention.

Thank you again for taking the trouble to share with me your concerns. I regret that we cannot address everyone individually but the team continues to receive a high number of complex cases involving those navigating healthcare, financial and other challenges and these constituents are being prioritised. I shall send you any response I receive from the Prime Minister.

Best wishes

Julia

by Martin A. Brooks at May 29, 2020 01:33 PM

August 22, 2016

Anton Piatek

Now with added SSL from letsencrypt

I’ve had SSL available on my site for some time using startssl, but as the certificate was expiring and requires manual renewal, I thought it was time to try out letsencrypt. I’m a huge fan of the idea of letsencrypt, which is trying to bring free SSL encryption to the whole of the internet, in particular all the smaller sites who might not have the expertise to roll out SSL or where the cost might be restrictive.

There are a lot of scripts for powering letsencrypt, but getssl looked the best fit for my use case as I just wanted a simple script to generate certificates, not manage apache configs or anything else. It seems to do a pretty good job so far. I swapped over the certificates to the newly generated ones and it seems pretty smooth sailing.

by Anton Piatek at August 22, 2016 06:51 PM

October 05, 2015

Philip Stubbs

Gear profile generator

Having been inspired by the gear generator found at woodgears.ca, I decided to have a go at doing this myself.

Some time ago, I had tried to do this in Java as a learning exercise. I only got so far, and gave up before I managed to generate the involute curves required for the tooth profile. Trying to learn Java and the maths required at the same time was probably too much, and it got put aside.

Recently I had a look at the Go programming language. Then Matthias Wandel produced the page mentioned above, and I decided to have another crack at drawing gears.

The results so far can be seen on Github, and an example is shown here.

Gear Profile Example Image

What I have learnt

  • Math makes my head hurt.
  • The Go programming language fits the way my brain works better than most other languages. I much prefer it to Java, and will try and see if I can tackle other problems with it, just for fun.

by stuphi (noreply@blogger.com) at October 05, 2015 08:32 AM

June 22, 2015

Anton Piatek

Hello Pace

After leaving IBM I’ve joined Pace at their Belfast office. It is quite a change of IT sector, though still the same sort of job. Software development seems to have a lot in common no matter which industry it is for.

There’s going to be some interesting learning: things like DVB are pretty much completely new to me, but at the same time it’s lots of Java and C++ with similar technology stacks involved. Sadly less Perl, but more Python, so maybe I’ll learn that properly. I’m likely to work with some more interesting JavaScript frameworks, in particular Angular.js, which should be fun.

The job is still software development, and there should be some fun challenges, like letting a TV set top box offer on-demand video content when all you have is a one-way data stream from a satellite, which makes for some interesting solutions. I’m working in the Cobalt team, which deals with delivering data from the TV provider onto set top boxes: things like settings, software updates, programme guides, on-demand content and even apps. Other teams in the office work on the actual video content encryption and playback, and the UI the set top box shows.

The local office seems to be all running Fedora, so I’m saying goodbye to Ubuntu at work. I already miss it, but hopefully will find Fedora enjoyable in the long term.

The office is on the other side of Belfast so is a marginally longer commute, but it’s still reasonable to get to. Stranmillis seems a nice area of Belfast, and it’s a 10 minute walk to the Botanical gardens so I intend to make some time to see it over lunch, which will be nice as I really miss getting out as I could in Hursley and its surrounding fields.

by Anton Piatek at June 22, 2015 02:53 PM

June 04, 2015

Anton Piatek

Bye bye big blue

After nearly 10 years with IBM, I am moving on… Today is my last day with IBM.

I suppose my career with IBM really started as a pre-university placement at IBM, which makes my time in IBM closer to 11 years.  I worked with some of the WebSphere technical sales and pre-sales teams in Basingstoke, doing desktop support and Lotus Domino administration and application design, though I don’t like to remind people that I hold qualifications on Domino :p

I then joined as a graduate in 2005, and spent most of my time working on Integration Bus (aka Message Broker, and several more names) and enjoyed working with some great people over the years. The last 8 months or so have been with the QRadar team in Belfast, and I really enjoyed my time working with such a great team.

I have done test roles, development roles, performance work, some time in level 3 support, and enjoyed all of it. Even the late nights the day before release were usually good fun (the huge pizzas helped!).

I got very involved with IBM Hursley’s Blue Fusion events, which were incredible fun and a rather unique opportunity to interact with secondary school children.

Creating an Ubuntu-based linux desktop for IBM, with over 6500 installs, has been very rewarding and something I will remember fondly.

I’ve enjoyed my time in IBM and made some great friends. Thanks to everyone that helped make my time so much fun.

 

by Anton Piatek at June 04, 2015 10:00 AM

April 11, 2015

Philip Stubbs

DIY USB OTG Cable

Suddenly decided that I needed a USB OTG cable. Rather than wait for one in the post, I decided to make one from spare cables found in my box of bits.
Initially I thought that it would be a simple case of cutting the cables and reconnecting a USB connector from a phone lead to a female USB socket. Unfortunately that is not the case.
The USB cable has four wires, but the micro USB plug has five contacts. The unused contact (the ID pin) needs to be connected to ground to make an OTG cable. The plug on the cable I used does not have a connection for the extra pin, so I needed to rip it apart and blob a lump of solder across the two pins. The body of the plug has a wall between each pin, so I rammed a small screwdriver in there to allow the soldered pins to fit.


I then reassembled the plug, and continued with connecting the wires together. This was an easy case of red to red, black to black, green to green and white to white. A piece of heat shrink covers the mess.
Now to use it. It allows me to plug a keyboard into my Nexus tablet. If I plug a mouse in, a pointer pops up. All of a sudden using the tablet feels like using a real computer. I am typing this with a keyboard on my lap down the garden with my tablet.
The real motivation for the cable was to allow me to use my phone to adjust the settings on the MultiWii-based control board of my quadcopter. For that, it seems even better than MultiWiiConf, and certainly a lot more convenient when out flying.

by stuphi (noreply@blogger.com) at April 11, 2015 04:31 PM

January 29, 2015

Philip Stubbs

Arduino and NRF24L01 for Quad-copter build

As part of my quadcopter build, I am using a couple of Arduinos along with some cheap NRF24L01 modules from Banggood for the radio transmitter and receiver. The idea came from watching the YouTube channel iforce2d.

When I started developing (copying) the code for the NRF modules, I did a quick search for the required library. For no good reason, I opted for the RadioHead version. Part of my thinking was that by using a different library from iforce2d, I would have to poke around in the code a bit more and learn something.

All went well with the initial trials. I managed to get the two modules talking to each other, and even had a simple Processing script show the stick outputs by reading from the serial port of the receiver.

Things did not look so good when I plugged in the flight controller. For that I am using an Afro Mini32. With that connected to the computer and Baseflight running, the receiver tab showed a lot of fluctuation in the control signals.

After lots of poking, thinking, and even taking it into work to connect it to an oscilloscope, it looked like the radio was mucking up the timing of the PWM signal for the flight controller. Finally, I decided to give an alternative NRF library a try, and from the Arduino playground site I selected this one. As per iforce2d, I think.

Well, that fixed it. Although, at the same time, I cleaned up my code, pulled lots of debugging stuff out and changed one if statement to a while loop, so there is a chance that changing the library was not the answer. Anyhow, it works well now. Just need some more bits to turn up and I can start on the actual copter!

by stuphi (noreply@blogger.com) at January 29, 2015 04:28 PM