IPFS

WebAssembly Gateway Interface (WAGI)+IPFS

When I first came across WebAssembly, I was already aware of the IPFS project, and immediately thought that the two would work well together. Specifically, I thought that WebAssembly modules stored in IPFS could be loaded automatically by an edge web server to allow for dynamic content without a centralized server and without requiring IPFS on the device utilizing the service. The latter is important for supporting legacy devices and services that cannot be updated.
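A minimal sketch of the device side of that idea, assuming a module has already been published under some CID (the gateway and CID below are placeholders, not real published content). The point is that the edge server, or any legacy client, only needs HTTP access to a gateway, not an IPFS node of its own:

    # Fetch a WebAssembly module from IPFS over plain HTTP via a gateway.
    # GATEWAY and CID are made-up placeholders for illustration.
    import urllib.request

    GATEWAY = "https://ipfs.io"
    CID = "bafybeiexamplemodulecid"   # hypothetical CID of a WAGI module

    with urllib.request.urlopen(f"{GATEWAY}/ipfs/{CID}") as resp:
        wasm_bytes = resp.read()

    # A WAGI-style host would hand these bytes to its WebAssembly runtime
    # and invoke the module once per incoming request; here we just cache them.
    with open("handler.wasm", "wb") as f:
        f.write(wasm_bytes)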

Cryptographic Accumulator

I’ve been following this thread that mostly boils down to this: if I have the SHA-256 hash of a file, can I use that to get the file from IPFS? The short answer is “not really”, because of the Merkle-DAG. The slightly longer answer is that the Merkle-DAG is what allows files to be chunked and those chunks to be verified as they come in, and SHA-256 has no facility for combining the hashes of two components into a single hash for the combined block. So you cannot look up a file on IPFS by the hash of the entire file in a way that lets you verify each block belongs to that hash without downloading the entire file.
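A toy illustration of the difference (this is not IPFS’s actual UnixFS layout, just the shape of the problem): a single SHA-256 over the whole file tells you nothing about any individual chunk, while a Merkle-style root built from per-chunk hashes lets each chunk be checked on its own as it arrives.

    import hashlib

    CHUNK = 256 * 1024  # IPFS chunks files, 256 KiB by default

    def flat_hash(data: bytes) -> str:
        # One hash over the whole file: only verifiable after downloading all of it.
        return hashlib.sha256(data).hexdigest()

    def merkle_root(data: bytes) -> str:
        # Hash each chunk, then hash the list of chunk hashes (a toy two-level DAG).
        # Given the leaf hashes, any single chunk can be verified by itself.
        leaves = [hashlib.sha256(data[i:i + CHUNK]).digest()
                  for i in range(0, len(data), CHUNK)]
        return hashlib.sha256(b"".join(leaves)).hexdigest()

    data = bytes(1_000_000)   # stand-in for a 1 MB file
    print(flat_hash(data))    # knowing only this, no chunk can be checked early
    print(merkle_root(data))  # differs from the flat hash; one cannot be derived from the other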

Arch on IPFS

I have been using the project described here for some time now to get Arch Linux package updates, even going so far as to join the pinning cluster, but I’m not sure how much longer that is going to last. As of five days ago, the cluster has stopped getting updates, and according to the status page, the cluster is offline for the foreseeable future. The quick(ish) fix is to roll the IPFS software back from the development version of ipfs-0.

Wikipedia on IPFS

While doing a web search for “steam ipfs”, I happened upon the website en.wikipedia-on-ipfs.org. I had known that there was a copy of Wikipedia made in 2017 for English, Kurdish and Turkish, but only today did I find out that somebody has registered a domain name for it. It looks like a recent development.

DNS in IPFS

Over the past couple of years, I’ve been thinking about things that could be used to replace parts of the internet and web we currently use. There are projects like cjdns that aim to replace the network routing layer of the internet with a system that does not require a centralized authority to issue IP addresses. Other parts of the web stack have would-be replacements as well (IPFS is one of them, aiming to replace HTTP(S)), but the one I will be looking at in this post is the Domain Name System (DNS).

Git Repo Update 2

I have made another change to the git repo handling code so that when publishing, only the repos that have been updated are added to IPFS again. This way, large repos only slow down the publish process when they themselves are updated, not every time any repository is updated. The new process is this (a sketch of the per-repo steps follows below):

- The post-update hook adds its path to the spool directory
- The monitor process sees the update and starts the publish
- The existing repo is added to a temporary directory in the mutable file system (MFS)
- For each updated repo:
  - The repo is added to IPFS without pinning
  - The directory for that repo in MFS is removed and replaced with the new hash
  - The old hash is unpinned and the new hash pinned
- The root hash of the new repo directory structure is published
- The temporary directory is removed from MFS

Now, git push is almost exactly the same speed as a plain ssh remote (only an additional flag set); the update is fast for small repositories and only slows down when processing a large repo.
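For a single repo, those per-repo steps could look roughly like this using the go-ipfs CLI; the MFS path, repo name and old-hash handling are made-up placeholders, not the actual publish script:

    # Sketch of the per-repo publish step; paths and names are hypothetical.
    import subprocess

    def run(*args):
        return subprocess.run(args, capture_output=True, text=True, check=True).stdout.strip()

    def publish_repo(repo_path, name, old_hash, mfs_root="/repos-tmp"):
        # Add the repo to IPFS without pinning and capture the new hash.
        new_hash = run("ipfs", "add", "-r", "-Q", "--pin=false", repo_path)
        # Replace the repo's directory in MFS with the new hash.
        subprocess.run(["ipfs", "files", "rm", "-r", f"{mfs_root}/{name}"], check=False)
        run("ipfs", "files", "cp", f"/ipfs/{new_hash}", f"{mfs_root}/{name}")
        # Swap the pins: unpin the old version, pin the new one.
        if old_hash:
            subprocess.run(["ipfs", "pin", "rm", old_hash], check=False)
        run("ipfs", "pin", "add", new_hash)
        return new_hash

    def publish_root(mfs_root="/repos-tmp"):
        # Publish the root of the new directory structure, then drop the temp dir.
        root_hash = run("ipfs", "files", "stat", "--hash", mfs_root)
        run("ipfs", "name", "publish", f"/ipfs/{root_hash}")
        run("ipfs", "files", "rm", "-r", mfs_root)
        return root_hash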

Indexer Update

If you’ve been following my IPFS Scanner, you will have noticed some changes today. I’ve added tags. There are also a good number of sites listed now, with widely varying levels of stability and content. Don’t blame me if there isn’t good content there: go make a site and make sure it is published with your node’s primary key (that would be ipfs name publish /ipfs/QyourSiteHashGoesHere). I’ve reworked the site generator to allow me to attach tags to every site in the index by /ipns/ key.
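The index entries boil down to something like the following shape; the keys, titles and tags here are invented placeholders, not sites actually in the index:

    # Hypothetical shape of the indexer's data: sites keyed by /ipns/ name,
    # each carrying a list of tags the generator can group pages by.
    sites = {
        "/ipns/QmExampleSiteKeyOne": {"title": "Example blog", "tags": ["blog", "personal"]},
        "/ipns/QmExampleSiteKeyTwo": {"title": "Example wiki", "tags": ["wiki", "reference"]},
    }

    def sites_for_tag(tag):
        return [name for name, info in sites.items() if tag in info["tags"]]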

Git Repo Update

I have a set of git repos published to IPFS that I talked about before here. Since that post a month ago, the repos have grown in count and size to the point that it is no longer feasible to use the automatic publish as it was. I have made one change and found that I will need to make further changes for it to remain usable. The change already made is to no longer run the publish from the post-update hook; instead, the hook creates a file in a spool directory, and a separate process monitors that directory and launches the publish script (a rough sketch follows below).
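As a rough sketch of that split, with made-up paths (the actual hook and monitor scripts are not shown here): the hook only drops a file into the spool directory so git push returns immediately, and a separate loop picks those files up and runs the publish.

    # Monitor side of the spool-directory scheme; SPOOL and PUBLISH are placeholders.
    import os
    import subprocess
    import time

    SPOOL = "/var/spool/git-ipfs"              # the post-update hook writes one file per push here
    PUBLISH = "/usr/local/bin/publish-repos"   # hypothetical publish script

    while True:
        pending = sorted(os.listdir(SPOOL))
        if pending:
            repos = []
            for entry in pending:
                path = os.path.join(SPOOL, entry)
                with open(path) as f:
                    repos.append(f.read().strip())   # each spool file holds the repo's path
                os.unlink(path)
            subprocess.run([PUBLISH, *repos], check=False)
        time.sleep(5)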

Mirroring Dead Websites

Today I checked in on cjdns to see where the project was. It looked about like it did the last time I checked it several months ago. While reading doc/bugs/policy.md I noticed a link that went to archive.org because the original site was dead due to domain name expiration. Thinking about how IPFS could have kept sites like this alive even when the original owner abandoned the site, I dove down a rabbit hole and ended up with a mirror of the site.
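Roughly, making such a mirror comes down to crawling whatever copy of the site is still reachable and adding the result to IPFS; the URL and directory below are placeholders, not the site from this post:

    # Mirror a site and publish the mirror on IPFS; url and paths are placeholders.
    import subprocess

    url = "https://example.org/"
    subprocess.run(["wget", "--mirror", "--convert-links", "--page-requisites",
                    "--no-parent", "--directory-prefix=mirror", url], check=True)

    # Add the whole mirror and print its root CID; anyone pinning that CID keeps
    # the site reachable even after the original domain name expires.
    cid = subprocess.run(["ipfs", "add", "-r", "-Q", "mirror"],
                         capture_output=True, text=True, check=True).stdout.strip()
    print(cid)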

Designed to Last

I was reading through the latest posts at No Tech Magazine and saw this article on designing web pages to last a decade. A lot of what is covered very much applies to websites on IPFS. Here is the list from the article (which you should read):

1. Return to vanilla HTML/CSS
2. Don’t minimize that HTML
3. Prefer one page over several
4. End all forms of hot-linking
5. Stick with the 13 web safe fonts +2
6. Obsessively compress your images
7. Eliminate the broken URL risk

Of these, #1, #2, #4, #5 and #6 absolutely apply to IPFS sites.