Subscribe to Personal Log via RSS and find other blogs to read in my blogroll.

ubox MSX lib news

I finally moved ubox MSX lib from its GitLab home to my own infra; you can check ubox MSX lib via cgit.

I moved SpaceBeans back in June 2023 –as I wrote in self-hosting git repos for now–, and I kind of put off moving my MSX project because, having more users, I thought it would be harder. But I was wrong!

SpaceBeans has binary artifacts that I have to build and host somewhere, and back in GitLab that was mostly automated and managed by CI (Continuous Integration). In my mind, ubox MSX lib was that, and more –because of the users–.

I was probably right about the users, but ubox MSX lib doesn’t have a binary that needs to be produced and distributed. The project’s output is just the source code, and in that case cgit has you covered with the refs view, where you can download a zip or a tar.gz of any of the tags. And that’s all!

So the project is out of GitLab now. I put an announcement on the GitLab repo and archived it, so it is preserved read-only. And, obviously, things are going to work slightly differently:

  • The project can be cloned via HTTPS only, with: git clone https://git.usebox.net/ubox-msx-lib.
  • The web interface to the repo is provided via cgit: ubox MSX lib tree view.
  • You can subscribe to new releases following this feed in your feed reader.
  • Contributions are now via email or, alternatively, you can make them available on a public repo so I can pull from it.
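For anyone new to the email workflow, the usual tool is git format-patch. The commands below use a throwaway local repository purely to illustrate –in practice you would clone the project with the URL above, and the maintainer address is a placeholder–:

```shell
# Create a throwaway repository with one commit, only to demonstrate
# the patch workflow; in practice you would work in your project clone.
git init -q demo
echo "hello" > demo/file.txt
git -C demo add file.txt
git -C demo -c user.email=you@example.com -c user.name=You \
    commit -q -m "Add example file"

# Export the last commit as a mailable patch file; it is written inside
# the repository as 0001-Add-example-file.patch
git -C demo format-patch -1 HEAD

# Then attach the patch to an email, or use git send-email:
#   git send-email --to=<maintainer address> 0001-Add-example-file.patch
```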

The project home is the same, ubox MSX lib in usebox, with news and the documentation for easy access.

The only thing currently missing is a shared public channel for communication and collaboration, a role previously filled by GitLab’s issues and merge requests. I know things can be more difficult now, for several reasons, but mainly because the way I want to work doesn’t follow the mainstream forge model you can see in GitHub, GitLab, Codeberg, and others –you can even self-host it with projects like Forgejo–.

Nothing is set in stone: if necessary, I could set up a mailing list somewhere. I’m not doing it for now because it doesn’t look like this project gets enough contributions for the resource to be used.

I’m planning a 1.2 release, adapting the project to work with the latest SDCC and its new calling convention. It will be a big change, because people using the older SDCC will stay on the current 1.1.14 version.

Meanwhile, there is an active fork by robosoft, in case you want an advance of what that 1.2 could look like (it also includes some MSX2 related changes that may interest you). Let’s keep those MSX games coming!

Funco

I like programming languages and, since I attended university many years ago, I have been attracted to their design and implementation. For example, I talked here about Micro, which I think is my latest complete release on the topic.

But implementing programming languages is complicated, and takes a long time. That’s OK, but I think I always make the same mistakes.

To start with, I tend to implement a language that is too big. I try to do everything “the right way(tm)”, which takes even longer and, in the case of compilers, when I get to the parts where I don’t have experience and really should investigate more, I’m overwhelmed and out of energy.

Like, why did I start working on a compiler written in Haskell when I’m still learning Haskell? Not a great idea!

Last week I was busy, tired, and frustrated, so one night I started a new project to see what I could do in a few hours, with the following conditions:

  • It doesn’t have to be nice, or well done. For example: error reporting? where we are going we don’t need error reporting!
  • Build an interpreter first, we’ll see if it is worth adding code generation (compiler) later.
  • Use tools that I know well already.
  • Keep everything small, so it is easy to change direction without a lot of refactoring.

And that’s how Funco came to be. It is very small, written in Python (3.10 or later, because I used match), it is only an interpreter –taking from Python everything I could–, and the user interface is very rough (raising exceptions on errors!). But it works, and it was very satisfying to write, even if it is not very useful other than helping me solidify what I already knew.

It is inspired by lispy and Atto, and it looks like this:

# example of a factorial function
def fact(n acc)
    if =(n 1)
        acc
    else
        # tagging a tail call with "@" so it can be optimized
        @fact(-(n 1) *(acc n))
    end
end

# main is the entry point of a program
def main()
    display("Running fact(50)...")
    display(fact(50. 1))
end

# output:
# Running fact(50)...
# 3.0414093201713376e+64

It is functional, with no variables, and well… almost everything is a function –I excluded function definition and conditionals to make it easier to use–. It feels very Lisp-like, and the syntax is a bit Ruby-like (which is useful to get syntax highlighting).

You won’t find anything revolutionary in the code, but that wasn’t the point. I even implemented recursive tail-call optimization, because otherwise it wouldn’t be useful at all, given that loops are implemented with recursion. For example:

# recursive for loop
def recfor(n fn)
    if >(n 0)
        fn(n)
        @recfor(-(n 1) fn)
    end
end

def main()
    recfor(10000 display)
end

Because there is no “return”, it is required to tag the tail calls with @ so the interpreter tries to optimise that call, avoiding the stack limit (and improving performance, although speed was never in my plans).
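A rough sketch of how such a tag can be honoured without growing the Python stack is a trampoline –the names below are invented for illustration, not Funco’s actual internals–:

```python
# Trampoline sketch: instead of recursing when it sees an @-tagged call,
# the evaluator returns a marker and a driver loop applies it iteratively.

from dataclasses import dataclass

@dataclass
class TailCall:
    """Marker the evaluator would return for an @-tagged call."""
    fn: callable
    args: tuple

def run(fn, *args):
    """Keep applying tagged tail calls in a loop: constant stack depth."""
    result = fn(*args)
    while isinstance(result, TailCall):
        result = result.fn(*result.args)
    return result

# Hypothetical user function: counts up to n. Plain recursion would hit
# Python's default recursion limit for large n; the trampoline does not.
def count(n, acc):
    if n > 0:
        return TailCall(count, (n - 1, acc + 1))  # like "@count(...)"
    return acc

print(run(count, 100000, 0))  # 100000, well past the recursion limit
```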

Perhaps I will put some more time into adding nicer error reporting, in case I can use Funco as a base for future experiments. Now that I have something small and easy to modify, it shouldn’t be that costly to make small experiments with code generation!

Edit (2024-04-29), I have added more examples. It is a toy language, but there is something of a brain teaser in writing these that I like.

Backdoor in upstream xz/liblzma

Long story short: someone added a backdoor to upstream xz/liblzma that will compromise an SSH server under some conditions. And it got places, for example xz-utils in Debian unstable and testing (it has been reverted now).

There’s a great in-detail summary by Evan Boehs: Everything I Know About the XZ Backdoor. I recommend reading it, because there is a lot to learn from this situation.

For example: how many single maintainers are out there taking care of vital pieces of open source software, without help, and who may even be in an especially bad place personally?

Alan Cox was commenting on Mastodon:

At a certain level I am amused that probably millions of dollars of careful espionage work has been pissed away by a fraction of a second delay in an exploit.

Far more of a problem though are systems that dynamically assemble stuff from latest versions of things. We can be sure that some maintainers of those thousands of tiny pieces are careful, reliable maintainers funded by various governments who if the call comes will use that trust to flip them for bad causes.

The first part refers to how the backdoor was detected –it made sshd slower and someone was looking–, and the second is one that has been bothering me a lot since I started working on the JVM professionally –with Scala–. It is common practice to update and include lots and lots of dependencies directly from different upstream providers, without appropriate scrutiny. Does it compile? Do your tests pass? I don’t think people really read the changelog, and we are live!

Which makes sense. It isn’t possible to review all your dependencies, because that is how industrial software is built today. And as we can see, the fact that a distribution like Debian is providing your packages is not bullet-proof –although the issue was in unstable/testing and never got to stable–, but I generally trust the distribution maintainers to do the right thing. If anything, it is another layer of security.

It is not if but when more things like these will happen, be it because the maintainers are overstretched and make an honest mistake –see Log4Shell as an example– or because there’s malicious intent.

Connecting blogs

I wrote about the IndieWeb about two years ago and, as part of that post, I mentioned webmentions, as one of the protocols they promote.

I was thinking: how can I introduce the idea as simply as possible? We could compare it to other linkback mechanisms, but then I realised that perhaps not many people remember what trackbacks or pingbacks are.

Webmention is currently a W3C recommendation –not quite a standard yet– that enables cross-site conversations. It is a simple protocol that allows your site to tell a different website that you mentioned one of its posts, as a comment, a like, or apparently other types of responses.

In a way, it is similar to the experience we have on social media: you know when someone replies to you, or likes, or quotes. The difference is that we own our websites, so we move to a distributed and heterogeneous landscape, instead of a centralised and uniform one –all using the same social network–, in which we need to add the glue between us so we can share that information.

I was thinking about implementing it here, even if it is not going to be easy because this blog is a static website, but then I remembered that my old blog in Spanish –which I closed after 18 years online– had comment support, but very little use. I got some comments over the years, but mostly in the early days, around 2003.

Comments helped to improve the posts, but they also connected blogs: it was common that the commenters had a blog as well, and as part of the comment they could provide a link to it. As you can imagine, the idea was eventually perverted –and somewhat ruined– because of spam. Those comments were a cheap way to get incoming links to a website, and that helped with SEO –search engine optimisation– and positioning in the search engines’ results. The spam is bad, of course, but I’m also unhappy with the incentive: those sites wanted traffic because they had ads, and that meant income.

In my blog I had to disable comments automatically on a post after a number of days, include simple captchas, and do fancy stuff with cookies and sessions. Very messy, for little benefit –the occasional comment–. And trackbacks lasted even less than comments. I think I disabled them shortly after finishing my implementation, because not many real people used them and it was mostly spam.

This blog doesn’t have comments, although you can always send me an email if you want to comment on anything, and some people have done that. Not often, but if I have received a handful of emails, that’s more comments than my old blog had in its last few years.

So I am undecided. Although I read blogs, I don’t seem to quote them often –perhaps I should, and find my small blogosphere like the one I had in the early 2000s–. Would it be worth it to add support for webmentions here?

Sending them is easy: I can do it as part of the publishing step that renders the posts –in markdown format– using Hugo. Receiving webmentions will require a bit of extra work, and likely some non-intrusive JavaScript to show them in the post somehow. I may give it a go for fun; I can always remove it if it turns out it was a bad idea.

In any case, I have the feeling this is something that should be widely supported, if blogs stand a chance against centralised social media.

Edit (2024-04-24), Alex writes on Micro-blogs:

I tried web-mentions and they didn’t work [how] I wanted them to. Far too few other blogs supported them, and for this blog, I didn’t know what to do with them. I didn’t want to send myself email so I turned them into comments, but mentions aren’t as strong a signal as a comment. Mentions aren’t public but comments are. Mentions don’t need a strong connection to the main article but comments do. By turning mentions into comments, I had made a mistake and the result was frustrating. So I got rid of them.

We had a conversation over email, and it really helped me to think about this.

In those emails I went full circle: from comments are not the answer to it was the comments all along! Although, perhaps, better than we had them back in the day. Probably using the Fediverse, so we have notifications and things like that, which was never a solved problem in the early 2000s –that’s my back in the day for this topic–.

I’m starting to think that what a blog needs to be connected with other blogs is comments. Alex quotes this take from James:

If your blog then has comments, and likes, and people on other platforms can comment, like, and share posts directly, is it really a blog any more? … Should blogs even have comments and notifications?

And I found this very interesting. The informal definition of blog I was most familiar with included comments from readers that complemented the post, to the point that when some blogs –specially those with a considerable audience– started to remove them, it was controversial: is that still a blog?

Maybe the likes and sharing posts are a bit too much, specially because if you add follows, that is social media. But comments were part of a blog, and that is how my chat with Alex ended: of course, we just need comments to connect blogs!

Not sure if it would be used enough –unless I find my blogosphere– or if the spam would make the functionality useless and too costly to operate, but in the meantime –as Alex mentions in his post– we will have to do the work and link to other people’s blogs.

Have a blogroll

About 3 years ago I was wondering here how we could get RSS back to what it was. It was a rhetorical question, and I didn’t provide an answer back then.

This is a topic going around the fediverse, and the usual suggestions are:

  • Have a personal website (or start a blog).
  • If you use RSS on your website, make it prominent so people know that it is there.
  • Have a blogroll, or a way to recommend the blogs you read.

The last point is important and, without thinking about it, I was neglecting it in this blog.

What is a blogroll exactly? According to Wikipedia:

A list of other blogs that a blogger might recommend by providing links to them (usually in a sidebar list).

It also had a social component: this blogger reads me and has my blog in their blogroll, so I will add theirs to mine. It was a way of building your small blogosphere, or community of blogs, by sharing links. I made a few friends like that, and I still keep in touch with some of them after over 20 years. All because at some point we all had a blog.

Although I still haven’t found how to integrate it into the blog itself –I blame mobile phone support, but it is all my fault–, I now have my public blogroll. It is generated automatically when I post and update the blog by processing ~/.config/liferea/feedlist.opml –liferea is still my feed reader–, so it should be up to date and reflect exactly what I’m reading.
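As an illustration of how little that OPML processing step needs –this is a sketch with the standard library, not my actual script, and the sample subscription list is made up–:

```python
# Turn a liferea-style OPML subscription list into (title, url) pairs,
# e.g. to render a blogroll page.

import xml.etree.ElementTree as ET

def blogroll(opml: str) -> list[tuple[str, str]]:
    """Return (title, site URL) for every feed in an OPML document."""
    root = ET.fromstring(opml)
    return [
        (o.get("title") or o.get("text", ""), o.get("htmlUrl") or o.get("xmlUrl", ""))
        for o in root.iter("outline")
        if o.get("xmlUrl")  # skip folders, which have no feed URL
    ]

sample = """<opml version="2.0"><body>
  <outline text="Blogs">
    <outline text="Example Blog" xmlUrl="https://example.org/rss"
             htmlUrl="https://example.org/" />
  </outline>
</body></opml>"""

for title, url in blogroll(sample):
    print(f"- [{title}]({url})")  # - [Example Blog](https://example.org/)
```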

My old blog, which ran from 2003 to 2021, had a blogroll up to late 2009 –according to the Wayback Machine–, with a section in the right column suggesting blogs to read. In 2010 I started learning Python and, as an exercise, I rewrote my old PHP blog engine –using Tornado and Redis, in pure NoSQL hype–. At that point I dropped the blogroll, but I can’t remember the reason.

I have a vague recollection of most blogs I used to read being inactive. Checking that blogroll from 2009, most of those blogs are either gone or stopped posting many years ago. Which is fair: I closed my old blog, after all. A good blogroll has to be alive and updated to be useful.

Anyway, the idea of this post is to say: hey, I have a blogroll! You should have one as well, like the other cool kids. I still have to decide how to integrate it with the blog, even if most people read these pages via a feed reader!

Finding a free secondary DNS

I think I started hosting my own domain because at some point the hosting provider I was using didn’t support all the record types I wanted to use –I have had usebox.net since 2002–. Also, I was younger and had more free time. Go figure: I thought DNS was fun.

A haiku about DNS (from nixCraft’s blog):

It’s not DNS
There’s no way it’s DNS
It was DNS

Generally you used to get a secondary DNS service for free as part of the deal with your hosting company, as long as the primary was hosted on an IP in their network. It was like that –if I recall correctly– when I was with OVH, and later on with Memset –I worked there for a few years, although it was acquired and things have changed–.

But now I’m with DigitalOcean and, as far as I can tell after looking at their docs, they only do secondary for you if you let them host the zone management. And obviously I don’t want to use their control panel and give up hosting the zone myself, or change hosting company –migrating services isn’t fun–.

Well, it is not a problem because I have two servers anyway, so I can host primary and secondary, right? Not quite: unfortunately I didn’t think this through and I ended up with both servers in the same data center :facepalm:, which is not good, as it is very well explained in the FAQ of PUCK Free Secondary DNS Service:

Q: Why do I need a secondary dns server?
A: You want to have (at least) TWO different DNS servers in two different physical locations. This will help you if your primary DNS server experiences a power outage or some sort of problem related to network connectivity.

So the question is: can I find a free secondary DNS service? Yes, I can! Although it wasn’t easy to get to that answer, probably because search engines have a difficult time providing relevant results these days.

There are a few for-profit possibilities that give you some free service with limitations, which can be too restrictive to be useful. For example: the number of queries a month their servers will respond to.

But I found two that don’t have such limitations.

I decided to give FreeDNS a go. It provides an impressive number of free services, and it was very simple to setup.

I like their philosophy and the feel. If I had the need to pay for any of their services, I would do it with pleasure; as opposed to other limited services –not bad or even expensive– that felt like a trial. Which is fair enough, as they are for-profit, but when I got an email from one of those providers telling me that their servers wouldn’t respond to any more queries that month unless I paid them, it didn’t have the effect they expected –they could have sent the email before I went over the quota, couldn’t they?–.

And I was delighted to also find PUCK Free Secondary DNS Service, a very simple service provided for free to “the community”. I don’t know the details, but I suspect it comes from the fact that nether.net used to be the largest public-access Unix system with the least restrictions on the internet –according to Jared Mauch’s about page–. The domain was created in January of 1995; I’d love to hear that story!

The response times looked good, so I decided to use both, FreeDNS and PUCK:

$ dig NS usebox.net | grep -A 4 ';; ANSWER'
;; ANSWER SECTION:
usebox.net.             86400   IN      NS      puck.nether.net.
usebox.net.             86400   IN      NS      ns2.afraid.org.
usebox.net.             86400   IN      NS      ns1.usebox.net.
usebox.net.             86400   IN      NS      ns2.usebox.net.
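On the primary side, this mostly comes down to allowing zone transfers to the secondaries and notifying them of changes. A minimal sketch, assuming BIND as the name server –the post doesn’t say which one I run– and with placeholder addresses (192.0.2.x) instead of the secondaries’ real IPs:

```
zone "usebox.net" {
    type primary;  // "master" in older BIND versions
    file "/etc/bind/zones/usebox.net";
    // let the secondaries pull the zone...
    allow-transfer { 192.0.2.1; 192.0.2.2; };
    // ...and tell them when it changes
    also-notify { 192.0.2.1; 192.0.2.2; };
};
```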

Transistor

I don’t play a lot, and finishing a game happens so exceptionally that I thought I would mention here that I finished Transistor.

This is not the first game by Supergiant Games I’ve played –I had played Pyre before, but didn’t get far in it–, and I had listened to different podcast episodes commenting on their games. When I saw that Transistor was heavily discounted on GOG, and that it should play fine on my machine, I jumped straight in. And I loved it!

The Wikipedia article says Transistor is an action role-playing video game, but that sounds too generic and I’m not completely sure it fits this game. I would say it is more of a turn-based RPG, though I see it can be played like an action one. For me it was more of a strategy game: you plan your turn, Red –the main character– executes it, and then you need to wait until you can run another turn. The enemies act in real time and, although Red can move, she is vulnerable until you can plan your turn again.

Transistor

It is a beautiful game

The world building and the mechanics of this game are fantastic. You fight against the process –some sort of computer entity that controls mechanical creatures and is invading the city–, and all the abilities your character gains in the game are “functions”, in a programming mood, that you can use in different forms: as a main move, as a modifier for an existing move, or as a passive one. As you gain more and more abilities, mixing those functions and knowing how to equip them to keep progressing is a real treat.

What I really loved in Transistor was the story: beautiful and tragic. The game looks amazing –isometric, all 2D!–, the voice acting is perfect, and the music elevates the whole package to a different level.

It is not easy, but not too hard either, because I’m not a great player and I managed to finish it. It is not a long game, but I found it a bit stressful, so I played in small bursts of one to two hours. I think it was the right size for me but, even if I wanted it to be longer, I’m not sure how much more they could have done considering the mechanics. The ending affected me in a way that I didn’t feel like starting a new game+ –as offered–, but maybe some day, now that I know the full story.

I have been playing more games, on and off, but nothing for long enough to feel I wanted to mention it here. The last game I had finished was The Hand of Fate, which is quite different from Transistor!

Block ads and trackers

From Lichess’ block ads:

Lichess is safe, because it is free of ads and trackers.

But browsing the rest of the Internet exposes us to the following threats:

  • Advertisement sells our screen estate and influences us
  • Tracking sells our personal information to increase advertisement effectiveness

These don’t benefit us, the website users, in any way. In fact, they use a lot of our computing power and bandwidth against us.

Fortunately, there are simple and legal ways to protect ourselves from these invasions.

In case you don’t know Lichess, it is a free and open-source Internet chess server run by a non-profit organization of the same name, where you can play chess against other people. It is technically amazing –and the backend is written in Scala–.

The website linked by Lichess suggests different ways of blocking ads, but I would recommend that, at the very least, you have an ad-blocker installed; ideally uBlock Origin, because it is free, open source, and very good.

The situation with ads and tracking is getting to a point where I believe we must all take action. I don’t know if a future where ads are ethical and respectful of users and their privacy is even possible, but the current approach by big tech is indefensible and hostile to us all. I can’t even remember when I last browsed without an ad-blocker, and I truly believe that now it is important that you use one as well.

That’s why if you visit usebox.net main page and you don’t use an ad-blocker, the website will show a banner recommending you to install one.

Please block ads and trackers: it is completely legal, and you must protect yourself when you browse the web. Don’t be intimidated by websites that want you to see ads and be tracked; their business model is completely broken and doesn’t respect you.

As a postscript, if you are using Chrome as your web browser, consider using Firefox instead, or any other browser that is not Chrome. There are plenty of reasons to ditch Chrome, protect yourself!

Retrogamedev IRC channel

I have been looking for IRC channels related to game development for a while, and I have found a few that are active.

Then there are channels related to more or less retro-computing that will discuss development (like #dosgameclub on AfterNET), but those were a bit hit and miss, or very focused on one specific system.

Long story short, everything seems to be on Discord or, depending on the country, Telegram. So I was accepting defeat, until I thought that perhaps I could start an IRC channel myself. And finally #retrogamedev on Libera happened.

Not that I have any idea of IRC admin, but it is about hanging out and having a place to ask questions, share projects –including work in progress– and get inspired by fellow developers. Any system, from 8-bit to 64-bit –or whatever, as long as you consider it retro enough–.

It is about socialising and building community around a very niche type of development.

Arguably we could have used XMPP or Matrix, but I feel those two options may be less popular than IRC or, at least, the subset of people already familiar with IRC and interested in retro gamedev may be larger.

So if you are interested in the topic, don’t hesitate to join us!

2023 Recap (sort of)

I used to do this type of post every Christmas, when I had my blog in Spanish, but for whatever reason it is a tradition I didn’t port to this blog. Although I’m not sure I want to start doing it, 2023 has been weird in several ways and it may be interesting to do some reflection here.

A good example is that after leaving Twitter for Mastodon, turns out that the social network where I am most active is… IRC.

Weechat

My wife looking at my screen: is that... IRC?

I was a heavy IRC user in the mid-to-late 90s, but after that I only connected when I was working on an open source project and either needed support or was contributing and it was useful to be there –for example, when I was working with OpenStack Kolla–. For me, it was instant messaging with XMPP (Jabber back then) that replaced IRC, and I didn’t think I was going back. XMPP didn’t succeed the way I was expecting –and that’s mainly why I’m only on Signal now–, but that is a story for a different day.

Where am I “active”? Mostly on Libera: #gamedev, and other channels like #haskell-gamedev or #cpc; and on AfterNET: #dosgameclub and #ludumdare.

I also spent a lot of time on tilde, because I was connecting to IRC via ctrl-c club; when I decided to connect directly from home –the club was kind of unstable–, I dropped that network because there was no way to conceal your IP –unlike Libera, via cloaking, or AfterNET, by default– and I didn’t like that. I know it probably doesn’t matter, but there you are.

Although the channels are dedicated to specific topics, it turns out people may talk about anything; which, thinking about it, is kind of the opposite of Masto: a social network about anything where we choose to focus on the topics we like. You find a lot of channels that aren’t very active, if at all, and sometimes there’s that user that is very annoying and may even spoil the channel to the point of it not being worth it; but that’s OK, because you can just leave the channel if that’s the only thing you get from it.

The DOS Game Club podcast channel is a good example of a good and healthy community, and it links nicely with another thing I didn’t see coming this year until it happened: making DOS games. Also: one of the channel members lives in my neighbourhood, which is a happy coincidence that allowed us to meet for a beer!

This year I have released two very different games for DOS:

  • Gold Mine Run!: targeting the second age of DOS gaming; meaning 32-bit, VGA and Sound Blaster.
  • The Return of Traxtor: targeting early IBM PC/XT; meaning 16-bit, CGA and PC Speaker.

I’m happy with these two games, and I have released a library for DJGPP to make it easier to reuse the code I wrote for “Gold Mine Run!”. I started doing some bits with CGA/EGA, but I didn’t get too far. In any case, I think there are chances of more DOS games coming from yours truly.

I even streamed the development of one of the games, which was very intense because I tried to finish the game for a game jam and there wasn’t enough time. And then I stopped streaming.

I kind of liked doing it, but I was making a big effort to ignore that I don’t like how both Twitch and YouTube fund themselves with ads –and by tracking users–. I investigated other options, like live streaming with PeerTube or Owncast, but I came to the conclusion that streaming video is expensive, and I’m not sure that what I stream is worth it. It is a bit like the cost of Mastodon, really –although that is much less, and I’m happy donating a small amount every month to the SDF social media efforts–. I may revisit my decision next year, but at the moment I don’t see a way forward with that.

I have worked on other things, but none of them got close to a finished state –most notably “Outpost” for the ZX Spectrum 48K, which I really wanted to finish this year–. As I mentioned recently, I’m sharing my gamedev time with reading books, and that was a factor. Looking forward to seeing what will happen in 2024.

Outpost WIP

It is taking too long, but it will get there!

Finally, my return to old protocols didn’t stop with IRC. I also spent some time reading groups on Usenet, although I’m not posting often, so I guess I’m still not completely in. I’m keeping my notes on newsgroups up to date, in case anyone wants to take a look. Things move slowly, but they still move!

Other than that, this blog keeps going, and I posted a couple of times on my Gemini capsule –although I haven’t spent time reading there since last year–.

I have played games this year, but no long ones after I abandoned Persona 4 on the PS2 –another one for the unfinished pile–. But I discovered Lutris, and that has simplified gaming for the whole family. I linked our GOG account, and the tinkering is over! Basically: if the game works on Linux, Lutris will make it run as optimally as possible, which is allowing us to play games I didn’t even know my humble PC could run. I still have a lot of games to play from itch.io –by the way, how disappointing their take on Masto has been–, but I don’t miss the time it sometimes takes to get a game running OK-ish!

My GoG library

Of course, I play one thing at a time -- also: Transistor is amazing!

Other than that, I have an Anbernic RG-350 that I bought in 2019 –and haven’t used much– that is now my Pokemon Sapphire machine. Ready to pick up and play any time; I’m 13 hours in and it is good fun. I haven’t played many 8-bit games this year, old or new. Nothing has excited me enough anyway.

And I think this is enough for a recap, although I’m sure I must have left out some things. I guess if those were important, I should have written about them in the blog anyway.