I'm testing the new bucketed fetching of feeds. The algorithm is similar to the one Aperture uses; the difference is that the buckets are powers of 2 at the moment. A feed in tier n is fetched again 2^n minutes after its last fetch.
In the middle of a rewrite of the backend of Ekster. The next version will have a full Postgres-based backend.
I'm trying to add the "_source" field to items in Ekster. At the moment it adds the information from the FeedHeader when it is available.
What would I need to send the webmention notifications to Ekster? It seems that I need to post the webmentions as Micropub posts to a channel.

Interdependent Thoughts - Revisiting My Ideal Feed Reader
List of use cases for a feed reader.

Ekster happily created new channels with the same name as existing channels. Now the channel name is unique, and creating a duplicate returns the current uid and name of that channel. It seems this behavior currently isn't specified.

It does improve the scriptability of Ek. I can now follow feeds while creating the channel like this:

ek follow `ek channels Blogs` https://example.com/blog/

The channels call will return the uid of the created channel, or the uid of the existing channel when it already exists.

Writing a Server Sent Events server in Go
Nice implementation for SSE in Go.

I created a new release of Ek(ster) which checks status codes in the HTTP responses of the Microsub server. See version 0.8.3 on GitHub.

And as always, please use the -verbose option to show the actual requests and responses. An example:

ek -verbose channels

This way you get the actual headers and body of the requests, which could help with debugging.

Ekster now contains a new feature: the backend now supports two types of Redis-based timeline backends. The default is still the timeline based on sorted sets; the new type is based on streams. The notification channel will now use this new stream type. Not many entries are posted to this channel, but I will use it for more things in the future.
It's still amazing to me that these posts show up directly in my reader, seconds after I post them.

Import OPML files into Microsub servers

Some time ago I wrote a one-liner to import OPML files. Now it has become a lot easier to import an OPML file with ek.

  ek import opml subscriptions.opml

It is also possible to export an OPML file.

  ek export opml > subscriptions.opml

With these two commands it becomes easier to import and export OPML files.

The format of the OPML file should match the structure of the Microsub channels and feeds: Microsub has a list of channels, and each channel contains a list of feeds. If the OPML file contains feeds on the first level of the file, these will be skipped. Channels on the second (or deeper) level will also be skipped.
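As a sketch, an OPML file in the expected two-level shape could look like this (the channel names and feed URLs are made-up examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<opml version="2.0">
  <body>
    <!-- first level: channels -->
    <outline text="Blogs">
      <!-- second level: feeds inside the channel -->
      <outline type="rss" text="Example blog" xmlUrl="https://example.com/feed.xml"/>
    </outline>
    <outline text="News">
      <outline type="rss" text="Example news" xmlUrl="https://news.example.org/rss"/>
    </outline>
  </body>
</opml>
```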

Retrieve all your subscriptions from a Microsub server

ek channels | awk '{print $1}' | xargs -n 1 ek follow
The new Microsub client can be found on GitHub. It may need some adjustments, but it will work if your IndieAuth and Microsub endpoints support CORS.
This afternoon I wrote a Microsub reader. CORS is a problem with this. I can enable it on the Microsub server to make it work. Do I need another backend to make it work?

I have added the -verbose flag to ek. This makes it possible to see the HTTP requests that are sent to the Microsub server and the responses that are returned. It can be a useful tool when creating your own Microsub server or client.

I thought I hadn't ported mark unread in Ekster to the Redis backend. But it turns out it isn't specced, and I never implemented it in the first place.
I started using "refs" in the Microsub items. When they're available in the original feed, Ekster will try to add them to the refs array.
Today I improved the flow when logging in from Micropub clients to Ekster. You are now sent through the IndieAuth flow when you aren't logged in. Also, the app name and logo are now shown when you log in with a Micropub client.

Implementing Microsub yourself (part 1)

This was also posted to news.indieweb.org.

This was also posted to /en/indieweb.

In this article I will try to show how you can implement a very simple version of Microsub yourself. 

Let's start

The protocol for Microsub consists of a number of actions. An action is selected by sending an action parameter to the microsub endpoint. When implementing a Microsub server you don't need to implement the full specification at once; what you need depends on what you want to use. For now we will only implement channels and timeline.

Simplified channels

For example the channels action provides 4 different functions in the full implementation.

  1. Get a list of the available channels
  2. Create a new channel with a name
  3. Update the name of a channel
  4. Delete a channel

A great way to start is to only return a fixed list of channels. That way you only implement function 1, and you only return a successful response for functions 2, 3 and 4. Clients will work when you do this, and it becomes a lot easier to implement.

As an example in PHP:

if ($_GET['action'] == 'channels') {
    $channels = [
        [ 'name' => 'Notifications', 'uid' => 'notifications' ],
        [ 'name' => 'Home', 'uid' => 'home' ],
    ];

    header('Content-Type: application/json');
    echo json_encode(['channels' => $channels]);
}

Simplified timeline

The timeline action provides 1 function. There are 2 parameters that allow paging; for a simplified version these do not need to be implemented.

The timeline action should return a response that looks like this:

{
  "items": [
    { ... },
    { ... }
  ],
  "paging": {}
}

By leaving paging empty you signal to the client that there are no more pages available at the moment.

The items array should be filled with JF2 items. JF2 is a simplified version of Microformats 2 that allows for easier implementation by clients and servers. An example could look like this:

{
    "type": "entry",
    "name": "Ekster now supports actual Indieauth to the Microsub channels. It's now possible for example to connect with http://indiepaper.io and archive pages to a channel. But of course the possibilities are endless.",
    "content": {
        "text": "Ekster now supports actual Indieauth to the Microsub channels. It's now possible for example to connect with http://indiepaper.io and archive pages to a channel. But of course the possibilities are endless.",
        "html": "Ekster now supports actual Indieauth to the Microsub channels. It's now possible for example to connect with indiepaper.io and archive pages to a channel. But of course the possibilities are endless."
    },
    "published": "2018-07-15T12:54:00+02:00",
    "url": "https://p83.nl/p/795"
}

If you return a list of items from your Microsub endpoint, you should see them in the client. Now the harder part is gathering these items from feeds and websites and converting them to JF2.

Simplified Microsub endpoint

Create a file called endpoint.php in the web root of your website.

The code can be found here: endpoint.php

Add the following information to your <head> tag:

<link rel="microsub" href="https://yourdomain.com/endpoint.php" />

That's all there is to it. Now you can log in with Monocle.

I just re-added filtering as a setting in Ekster. This allows for easy mentions from all feeds, because it matches all incoming items against a regex and adds the matching items.
Blocking items also works (but only per channel).
Ekster now supports actual Indieauth to the Microsub channels. It's now possible for example to connect with indiepaper.io and "archive" pages to a channel. But of course the possibilities are endless.
Perhaps it's possible with some kind of reflection to copy fields from a microformats.Data object to a normal struct, just like how "encoding/json" works. You could create your own structs and copy those fields from the MF2 into your own data model.
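One cheap way to get this behavior, as a sketch: round-trip a generic map through encoding/json into your own struct. The Entry type and the flat map below are invented for this example; the real microformats.Data type nests its values differently.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Entry is our own data model (example type, not from Ekster).
type Entry struct {
	Name string `json:"name"`
	URL  string `json:"url"`
}

// fromMap copies matching fields from a generic map into a struct by
// marshaling the map to JSON and unmarshaling into the struct, so the
// field mapping follows the usual encoding/json struct tags.
func fromMap(data map[string]interface{}, out interface{}) error {
	b, err := json.Marshal(data)
	if err != nil {
		return err
	}
	return json.Unmarshal(b, out)
}

func main() {
	props := map[string]interface{}{
		"name": "A post title",
		"url":  "https://example.com/post/1",
	}
	var e Entry
	if err := fromMap(props, &e); err != nil {
		panic(err)
	}
	fmt.Println(e.Name, e.URL)
}
```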

Peter Stuifzand created a new issue for pstuifzand/ekster

Add set and hash of feeds to Redis to improve subscriptions

The subscriptions of the feeds are kept in keys named "feed:<id>", but these are difficult to reference at the moment. There should be a set of "feed:<id>" items and a hash from urls to "feed:<id>". That way I can deduplicate feeds and make it easier to resubscribe.

Peter Stuifzand replied to pstuifzand/ekster issue #1

Resubscription was added, now I need to find some feeds with WebSub and see if those work.

Peter Stuifzand created a new issue for pstuifzand/ekster

Add tracking to channels (include from all)

In the original version I had a filter that added all items matching a regex to a channel. In the settings page there should be a place to add keywords to a channel (even if it has no feeds) and gather all items matching those keywords (or perhaps a regex).

Peter Stuifzand created a new issue for pstuifzand/ekster

Add filtering to feeds (exclude)

Should be able to add keywords to a channel that will be filtered on. If an entry contains the keyword, it won't be added to the channel.
Using time.Time will by default not result in a nice serialization in Redis. Now I use Unix time as an int64, which is much easier to work with.
I did some updates on the WebSub part of my Microsub server Ekster. It now tries to subscribe to feeds and send resubscribes. Before it already received incoming posts.

I just made a change to Ekster that allows it to receive Micropub requests from Indiepaper. In a way this already worked, but only with source_id and JF2 request bodies. This change allows the auth token to be in the Authorization header and JSON micropub requests.

I really like the oldest-first ordering of posts in Ekster (my Microsub server), but it means that when you have read everything, it doesn't show the old posts anymore.
It always starts with the oldest unread post. I would like to have a way to show the older posts. This could be done with "before" in timelines, but it doesn't seem to be implemented in Monocle, for example.

Peter Stuifzand replied to a post on publog.stuifzandapp.com

I love how easy Go makes using these interfaces. The server implements it (as a file and Redis backend) and the client calls it (instructed from the command line). Even the HTTP part calls the server backend through this interface.
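A sketch of how such a shared interface might look (the method names and types here are illustrative, not the exact ones from the repository):

```go
package main

import "fmt"

// Channel is a Microsub channel.
type Channel struct {
	UID  string
	Name string
}

// Microsub is the interface both the server backends and the client
// code program against: backends implement it, the CLI calls it.
type Microsub interface {
	ChannelsGetList() ([]Channel, error)
	FollowURL(uid string, url string) error
}

// memoryBackend is a trivial in-memory implementation, standing in
// for the file and Redis backends.
type memoryBackend struct {
	channels []Channel
}

func (b *memoryBackend) ChannelsGetList() ([]Channel, error) {
	return b.channels, nil
}

func (b *memoryBackend) FollowURL(uid, url string) error {
	// A real backend would subscribe to the feed here.
	return nil
}

func main() {
	var ms Microsub = &memoryBackend{channels: []Channel{{UID: "home", Name: "Home"}}}
	chs, _ := ms.ChannelsGetList()
	fmt.Println(chs[0].UID, chs[0].Name)
}
```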
I just created a Microsub client for the command line. Find it at https://github.com/pstuifzand/microsub-server #indieweb #microsub

I just implemented part of the Feedbin API in my Microsub server. It seems that in the future it will be quite easy to add all kinds of backends and maybe even set them per channel or similar.

Add simple start of feedbin API proxy

There are many parts to the microsub spec. My server implements follow and preview. But Monocle doesn't, and Together does. Perhaps there is a place for external tools that help with following and add feeds to your microsub reader.

Microsub changes

Yesterday I made an improvement to support paging with ZADD and ZRANGEBYSCORE. This allows me to get a range of entries based on the timestamp of the published date (converted to a Unix timestamp). The problem is that read and unread entries are still in the same list, so it's hard to find the first unread entry. That entry is the starting point of the list of entries for the first page of items.

I implemented the solution like this: keep two lists. One with all unread items and one with the read items. In principle an entry moves from one list to the other in a linear fashion, because that's the reading order. So now when there is no after or before argument the server can send the first twenty items of the list. The first and last item contain the next before and after values. Nice thing is that I now get unread count for free with ZCARD.
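The two-list design can be modeled with plain in-memory lists to show the idea (the real version uses two Redis sorted sets with ZADD/ZRANGEBYSCORE, and ZCARD for the count; this is only a sketch with invented names):

```go
package main

import "fmt"

// timeline models the two-list design: items move from the unread
// list to the read list when marked read, in reading order.
type timeline struct {
	unread []string // item IDs, ordered by published time
	read   []string
}

// markRead moves an item from the unread list to the read list.
func (t *timeline) markRead(id string) {
	for i, u := range t.unread {
		if u == id {
			t.unread = append(t.unread[:i], t.unread[i+1:]...)
			t.read = append(t.read, id)
			return
		}
	}
}

// unreadCount comes for free: the length of the unread list
// (ZCARD in the Redis version).
func (t *timeline) unreadCount() int { return len(t.unread) }

func main() {
	tl := &timeline{unread: []string{"a", "b", "c"}}
	tl.markRead("a")
	fmt.Println(tl.unreadCount())
}
```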

Yesterday I wrote a Microsub server in Go. It contains most of the features that I need to connect with the Together reader app.

Some observations:

- search is pretty strange to implement; it now only supports full URLs.
- it does support automatic fetching of feeds
- it doesn't have caching support for items; it always fetches the full page
- no support for smart or automatic feed downloading, no WebSub; it should in the future
- mute and block are not implemented
