Three cheese omelette
There's our ride. Going to Chicago.
A couple of days ago, I had a meeting with one of our clients to discuss the state of their website and what we could do to further improve it for their visitors. During this conversation, the client mentioned that they wanted to add a blog section: a place where they could inform their visitors about upcoming events, publish photos and videos taken at those events, and make other announcements as well.
The client went on to point out that all the content that should go into this blog already existed (and was continually being posted) on third-party platforms like Facebook, Twitter, Flickr and Youtube. As an advocate of the IndieWeb principles, I was immediately excited by this, not least because I’ve never really come across this phenomenon on a client’s website before (at least not to this extent).
So I suggested publishing their content on their own site first and then syndicating copies out to the various silos they were already using. The client asked why they should do it that way. After all, they really enjoy the ease of use of the posting interfaces the various platforms offer.
And I get that. It doesn’t take long to produce a Facebook post: you simply enter whatever message you want to share, upload some photos or even a video, and you’re done. In a matter of minutes you’ve published something on the web. But an important question to ask here is: how long is it going to be there? Will people be able to access this information in a couple of months? In five years?
When it comes to publishing on the web, people tend to think: the web doesn’t forget, the Internet never forgets. Once you’ve put something online, it’s going to stay there forever. But that’s not necessarily true. The only thing that is keeping your content accessible over time is the URL. If the URL is gone, your content will be gone, too.
The problem with this is that if you don’t own the URLs your content lives on, then you don’t get to decide whether it will be available in the future or not. Companies like Facebook and Twitter do. And if any one of these services disappears one day, well, then your content will most likely disappear with them.
One might argue that this is never going to happen. How could a company as large as Facebook possibly not be there one day? Well, it happened to a lot of companies on the web already. No one would have imagined that sites like Del.icio.us, Readability, Picasa, Google Code, Editorially, Google Reader, Posterous and Gowalla (just to name a few) would close one day. And yet they did.
When a site shuts down, one might think: well, ok, the URLs may be gone, but the content I put online there wasn’t that important to me anyway, so who cares?
The assumption here is: your content isn’t that important to you. But in reality it might indeed be important to other people. While you might not care whether or not your content disappears from the web, other people most certainly do. They might have bookmarked it, they might have linked to it, hell, they might even have written a book and put the URL in there as a reference.
If you publish your content on your own website, you can make sure that it will still be available in the future. That doesn’t mean that you shouldn’t be posting to third parties at all, but I would encourage you to first Publish on your Own Site, then Syndicate Elsewhere (POSSE). This way, you get to decide what happens to your content in the future and can still engage with the audience on a particular platform.
As for the client, I’m really looking forward to implementing some of the IndieWeb principles on their website, not least because – them being a museum – I think long-term preservation is right up their alley.
Watching Mad Men appropriately.
I’ve been watching more and more people introduce offline capabilities to their websites, so I decided to give that a go on my website as well. That meant I needed to add a service worker.
After reading Jeremy’s excellent post about his approach for using a service worker, I peeked through his code and implemented it here. Now when you visit my website (and your browser supports the use of service workers), an offline page and some static files get cached right away. As you browse the site, the pages you visit will be cached for offline viewing as well.
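For anyone curious what that cache-as-you-browse approach looks like, here is a minimal sketch of such a service worker. This is not the actual code running on this site (or Jeremy’s); the cache name, offline URL and asset list are placeholder assumptions, and the guard around the event listeners simply keeps the file from erroring outside a worker context.

```javascript
// sw.js — a minimal cache-as-you-browse service worker sketch.
// CACHE_NAME, OFFLINE_URL and STATIC_ASSETS are placeholder assumptions.
const CACHE_NAME = 'site-cache-v1';
const OFFLINE_URL = '/offline/';
const STATIC_ASSETS = [OFFLINE_URL, '/css/styles.css', '/js/main.js'];

// Pure helper: only successful, same-origin GET responses get cached.
function isCacheable(requestMethod, responseStatus, responseType) {
  return requestMethod === 'GET' &&
         responseStatus === 200 &&
         responseType === 'basic';
}

// Only register handlers inside a real worker context.
if (typeof self !== 'undefined' && typeof self.addEventListener === 'function') {
  // On install, cache the offline page and static assets right away.
  self.addEventListener('install', event => {
    event.waitUntil(
      caches.open(CACHE_NAME).then(cache => cache.addAll(STATIC_ASSETS))
    );
  });

  // On fetch, try the network first; cache a copy of pages as they are
  // visited, and fall back to the cache (or the offline page) when offline.
  self.addEventListener('fetch', event => {
    event.respondWith(
      fetch(event.request)
        .then(response => {
          if (isCacheable(event.request.method, response.status, response.type)) {
            const copy = response.clone();
            caches.open(CACHE_NAME).then(cache => cache.put(event.request, copy));
          }
          return response;
        })
        .catch(() =>
          caches.match(event.request).then(cached => cached || caches.match(OFFLINE_URL))
        )
    );
  });
}
```

The network-first strategy shown here keeps pages fresh when you’re online; other approaches (cache-first for static assets, for example) trade freshness for speed.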
I also wrote a little module for Processwire (the CMS I’m using) which allows me to specify offline pages, static assets and some other options right from within the Processwire backend.
If you’re using Processwire as well and want to get started with your own offline experience, feel free to use the module as it is or tweak it to your needs. You can find it on Github.