Matt Mullenweg v. WP Engine

Automattic CEO and WordPress co-founder Matt Mullenweg published a post on September 21 calling WP Engine a “cancer to WordPress”. For the uninitiated: WP Engine is an independent company that provides managed hosting for WordPress sites, while WordPress.com is owned by Automattic, which also leads the development of the open-source project at WordPress.org. WP Engine’s hosting plans start at $30 a month and it enjoys a good public reputation. Mullenweg’s post, however, zeroed in on WP Engine’s decision not to record the revisions you make to your posts in your site’s database. Revisions are a basic feature of the WordPress content management system, and based on their absence Mullenweg says:

What WP Engine gives you is not WordPress, it’s something that they’ve chopped up, hacked, butchered to look like WordPress, but actually they’re giving you a cheap knock-off and charging you more for it.

The first thing that struck me about this post was its unusual vehemence, which Mullenweg has in the past typically reserved for more ‘extractive’ platforms like Wix, whose actions have also been more plainly objectionable. WP Engine has disabled revisions but, as Mullenweg himself pointed out, it doesn’t hide this fact: it’s right there on the ‘Platform Settings’ support page. WP Engine also offers daily backups, so you can readily restore one and go back to a previous ‘state’ of your site.

Second, Mullenweg accuses WP Engine of “butchering” WordPress, but this is stretching it. I understand where he’s coming from, of course: WP Engine advertises WordPress hosting that doesn’t come with one of the CMS’s basic features, a fact WP Engine doesn’t hide but doesn’t really advertise either. But I’d hardly call this “butchering”, much less in public and more than a decade after Automattic invested in WP Engine.

WP Engine’s stated reason is that post revisions increase database costs that the company would like to keep down. Mullenweg interprets this to mean WP Engine wants “to avoid paying to store that data”. Well, yeah, and that’s okay, right? I can’t claim to be aware of all the trade-offs that determined WP Engine’s price points but turning off a feature to keep costs down and reactivating it upon request for individual users seems fair.
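(As an aside, revision storage in a stock WordPress install is governed by the WP_POST_REVISIONS constant in wp-config.php, which a host can cap or switch off. If you’re curious what your own host is doing, here’s a minimal sketch, in Python, that asks WordPress’s core REST API how many revisions it has kept for a post; the site URL, post ID and application-password credentials below are placeholders, not anything real.)

```python
# Minimal sketch: check whether a WordPress host is keeping revision history
# for a post, via the core REST API's revisions endpoint.
# SITE, AUTH and POST_ID are hypothetical placeholders.
import requests

SITE = "https://example.com"             # hypothetical site
AUTH = ("editor-user", "app-password")   # hypothetical application password
POST_ID = 123                            # hypothetical post ID

resp = requests.get(
    f"{SITE}/wp-json/wp/v2/posts/{POST_ID}/revisions",
    auth=AUTH,
    timeout=10,
)
resp.raise_for_status()
revisions = resp.json()

if revisions:
    print(f"{len(revisions)} revisions stored for post {POST_ID}")
else:
    print("No revisions stored; the host may have disabled WP_POST_REVISIONS.")
```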

In fact, what really gets my goat is Mullenweg’s language, especially around how much WP Engine charges. He writes:

They are strip-mining the WordPress ecosystem, giving our users a crappier experience so they can make more money.

WordPress.com offers a very similar deal to its customers. (WordPress.com is Automattic’s hosted platform, where users can pay the company to run WordPress sites for them.) In the US, you need to pay at least $25 a month (billed yearly) to be able to upload custom themes and plugins to your site; none of the plans below that rate has the option. You also need this plan to access your site’s revision history and jump back to earlier points in it.

Does this mean WordPress.com is “strip-mining” its users to avoid paying for the infrastructure those features require? Or is it offering fewer features at lower price points because that’s how it can make its business work? I used to be happy that WordPress.com offers a $48-a-year plan with fewer features because I didn’t need them, just as WP Engine seems to have determined it can charge its customers less by disabling revision history by default.

(I’m not so happy now because WordPress.com moved detailed site analytics — anything more than hits to posts — from the free plan to the Premium plan, which costs $96 a year.)

It also comes across as disingenuous for Mullenweg to say the “cancer” that WP Engine represents will spread if left unchecked. He himself writes that no WordPress host listed on WordPress.org’s recommended hosts page has disabled revision history, but is he aware of the public reputation of these hosts, their predatory pricing habits, and their lousy customer service? Please take a look at Kevin Ohashi’s Review Signal website or r/webhosting. Cheap WordPress in return for a crappy hosting experience is the cancer that has already spread because WordPress didn’t address it.

(It’s the reason I switched to composing my posts offline on MarsEdit, banking on its backup features, and giving up on my expectations of hosts including WordPress.com.)

It’s unfair to accuse companies of “strip-mining” WordPress when what hosting providers are doing is offering users a spam-free, crap-free hosting experience that’s also affordable. In fact, given how flimsy many of Mullenweg’s arguments seem to be, they’re probably directed at some other, deeper issue: perhaps what he perceives to be WP Engine not contributing enough back to the open-source ecosystem?

Blogging at NYTimes and The Hindu

I’m going to draw some parallels here between The New York Times and The Hindu in the context of the Times’s decision to shut or merge up to half of its blogs. (Disclosure: I launched The Hindu Blogs in December 2012 and coordinated the network until May 2014.) This is not about money-making, at least not directly, as much as about two newspapers, facing similar economic problems at vastly different scales, confronting the challenges of multi-modal publishing. The Times’s decision to move away from blogs, which was brought to wider attention when Green went offline in March 2013, should not be confused with a rejection of blogging. In fact, it’s the opposite, as Andrew Beaujon wrote for Poynter:

Assistant Managing Editor Ian Fisher told Poynter in a phone call: “We’re going to continue to provide bloggy content with a more conversational tone,” he said. “We’re just not going to do them as much in standard reverse-chronological blogs.”

This is mixed news for blogs. The experimental quality of the early days of blogging, which blogs both fed and fed off, is what inspired the many post formats that emerged over the years and competed with each other. That competition intensified as more news publishers came online and, sometime in the late 2000s, digital journalism began to take a shape of its own. The blog may have been fluidly defined, but its many mutations weren’t, and they were able to take root, most recognizably in the form of Facebook, whose integrated support for a variety of publishing modes and forums made the fluidity of blogging look cumbersome.

The stage was set for blogs to die, but in a very specific sense: it is the container that is dying. This is good for blogs, because the styles and practices of blogging live on; only the name doesn’t. This is not only a conceptual redefinition but also a technical one, because what killed blogs is also what might keep the digital news-publishing industry alive. It’s called modularization.

The modular newsroom

While I was at The Hindu, I sometimes found it difficult to think like the reader because it was not easy to forget the production process. The CMS is necessarily convoluted: if it aspires to make the journalist’s life easier, it has to be ‘department’-agnostic, with print, online, design and production all working seamlessly on it, and each of those departments has a markedly different workflow. There is the obvious downside of ponderousness, but on such trade-offs you have to pick a side.

One reason Beaujon cites for the Times’s decision is that the blogs’ CMS didn’t play along with the one running the rest of the site (which recently received a big redesign). I can’t say the problem is very different at The Hindu. In either institution, the management’s call will be to focus the CMS on whichever product, department or service makes the biggest profits (assuming one of them does so by a large margin), and blogs, despite often being the scene of “cool” content, are not prioritized. The Sulzbergers have already done this by choosing to focus on one product, while at The Hindu, Editor Malini Parthasarathy has in the last two months ramped up her commitment to its digital platform with the same urgency that might have been asked of Siddharth Varadarajan had he been around.

The reason I said the demise of the container was also technical is that, in order to keep a department-agnostic CMS both lightweight and seamless (not to mention affordable), larger organizations must eliminate redundant tasks by, say, getting a “print” journalist to publish his or her story online as well. Second, the organization must build a CMS focused on interoperability as much as intra-operability. Technically speaking, each department should be an island that communicates with the others by exchanging information formatted in a particular way, or according to some agreed standards, as in the sketch below.
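To make that concrete, here is a toy sketch in Python of what such an exchange could look like. The field names are invented for illustration and are not any real newsroom standard; an actual implementation would lean on something like NewsML or an in-house spec.

```python
# A toy illustration of departments exchanging a story in one agreed format.
# The schema below is invented for this sketch, not an industry standard.
import json
from dataclasses import dataclass, field, asdict

@dataclass
class Story:
    slug: str
    headline: str
    body: str
    byline: str
    desk: str                              # originating department, e.g. "print"
    tags: list = field(default_factory=list)

    def to_json(self) -> str:
        """Serialize to the shared interchange format (plain JSON here)."""
        return json.dumps(asdict(self), ensure_ascii=False, indent=2)

# The print desk files once; the web desk, a blog, or an app feed can all
# consume the same payload without anyone re-keying the story.
story = Story(
    slug="monsoon-preview",
    headline="A wetter June, says the Met department",
    body="...",
    byline="Staff Reporter",
    desk="print",
    tags=["weather", "monsoon"],
)
print(story.to_json())
```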

Fragmenting the news

This mirrors what happened to blogs: the fragmentation that helped make the blog popular is also what helped establish its biggest competitors, like tumblelogs, Twitter, Snapchat, Pinterest, etc., and each of these modes in turn is inspiring new ways to tell stories. A more modularized newsroom, in the same vein, will be able to tell different kinds of stories and be more adaptive to change and shock, not to mention better positioned to serve news consumption that is itself fragmenting. Better yet, this will also give journalists the opportunity to develop workflows and an ethos suited specifically to their work. That’s one thing that doesn’t bode well for the unfortunate blogs at the Times: “reintegration” is always accompanied by some losses.

Through all of this, the good name of the “news site” might get lost, but we mustn’t underestimate our readers’ ability to spot the news under any other name.

However, this is where the similarities between the two organizations end, because they operate in drastically different markets. While traffic on both sites mostly entered ‘sideways’, i.e. from a link shared on social media or from the site homepage rather than through the blogs’ landing page, what that traffic did for each site is different. For one, the purchasing power of the people The Hindu calls its audience is far lower, so the symmetry the Times might enjoy between its ad rates in print and on the web is all but impossible to achieve in India. That makes the battle to balance UX against income grittier within Indian publications. The quality of the news is also nothing to write home about, although there is reason to believe that is changing as it becomes less shackled by infrastructural considerations. Consider Scroll.in or Homegrown.

These are, of course, nascent thoughts, a knee-jerk reaction to learning that the Times was shutting The Lede. But let’s not lament the passing of the blog; it was meant to happen. On the other hand, the blog’s ability to preserve its legacy by killing itself could hold many lessons for the newsroom.

Building the researcher’s best friend

One of the most pressing problems for someone conducting research on their own initiative has to be information storage, access, and reproduction. Even if you’re just going through interesting papers on preprint servers and in journals and want to quickly store text, excerpts, images, videos, diagrams, and graphs on the fly, you’ll notice that a multitude of storage options exist, none of them academically intelligent.

For starters, I could use an offline notepad with a toggleable LaTeX interpreter for quickly keying in equations.

So, when I stumbled across this paper written by Joshi et al. at Purdue University in 1994, I was glad someone had taken the time and trouble to think up the software architecture of an all-encompassing system that would handle information in all media and provide options for cross-referencing, multiple modalities, multiple authors, subject-wise categorization, cataloguing, data mining, and so on. Here’s an excerpt from the paper:

The electronic notebook concept is an attempt to emulate the physical notebook that we use ubiquitously. It provides an unrestricted editing environment where users can record their problem and solution specifications, computed solutions, results of various analyses, commentary text as well as handwritten comments.

The notebook interface is multimodal and synergetic, it integrates text, handwriting, graphics, audio and video in its input and output modes. It functions not only as a central recording mechanism, it also acts as the access mechanism for all the tools that support the user’s problem solving activities.

(I’d like to take a moment to stress good data mining, because it plays an instrumental role in effecting serendipitous discoveries within my finite corpus of data. As a matter of definition: if the system is smart enough to show me both something it knows could be related to what I’m working on and something I don’t yet know is related to what I’m working on, then it’s an awesome system.)
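To gesture at what that could look like in practice, here is a minimal sketch, not the Purdue system but one assumption of how a “related notes” feature might start: TF-IDF vectors and cosine similarity over a small corpus, using scikit-learn. The note titles and texts are dummies.

```python
# Minimal sketch of "show me related notes": rank a corpus of stored notes
# by cosine similarity to the note currently being worked on.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

notes = {
    "kepler-exoplanets": "Kepler photometry, transit depths, exoplanet radii",
    "lhc-higgs": "ATLAS and CMS diphoton channels, Higgs boson mass",
    "monsoon-models": "ENSO indices, Indian monsoon rainfall forecasts",
}
current = "transit timing variations in multi-planet Kepler systems"

vectorizer = TfidfVectorizer(stop_words="english")
docs = list(notes.values()) + [current]
matrix = vectorizer.fit_transform(docs)

# Compare the current note (last row) against every stored note.
scores = cosine_similarity(matrix[len(notes)], matrix[:len(notes)]).ravel()
for title, score in sorted(zip(notes, scores), key=lambda pair: -pair[1]):
    print(f"{score:.2f}  {title}")
```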

The Purdue team went on to implement a prototype, but you’ll see it was limited to being an interactive PDE solver. If you’re looking for something along the same lines, the Wolfram Mathematica framework has to be your best bet: its highly intuitive UI makes visualizing the task at hand a breeze, and lets you focus on designing practical mathematical and physical systems while it takes care of the mechanics.

However, that misses the point. Every time I come across an interesting paper, some sections of which could fit into a corpus of knowledge I’m assimilating at the time, I currently rely on a fragile customization of the WordPress CMS that “works” with certain folders on my hard drive. And by “works”, I mean I’m the go-between semantic interpreter, which is exactly the job I need an automaton for. On one of my other blogs (unnamed here because it’s an online index of sorts for me), I have tagged and properly categorized posts that are actually bits and pieces of different research paths.
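For what it’s worth, the mechanical half of that automaton is easy to sketch; the semantic half is the hard part. Here is a rough Python sketch, with a hypothetical site, file path and application password, that files a local note as a draft through WordPress’s core REST API:

```python
# Rough sketch: take a local text file and file it as a draft post on a
# WordPress site via the core REST API. SITE, AUTH and the note path are
# hypothetical placeholders; tagging is left as a comment because the API
# expects term IDs rather than tag names.
from pathlib import Path
import requests

SITE = "https://example.com"              # hypothetical site
AUTH = ("me", "application-password")     # hypothetical credentials

note = Path("notes/exoplanets/transit-timing.txt")  # hypothetical file
payload = {
    "title": note.stem.replace("-", " ").title(),
    "content": note.read_text(encoding="utf-8"),
    "status": "draft",
    # "tags": [12, 34],  # term IDs, e.g. from GET /wp-json/wp/v2/tags?search=...
}

resp = requests.post(f"{SITE}/wp-json/wp/v2/posts", json=payload, auth=AUTH, timeout=10)
resp.raise_for_status()
print("Filed draft:", resp.json()["link"])
```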

I’m willing to pay for products that offer the kind of functionality I’m looking for, and I’m sure others would too, given how much handier such tools are becoming by the day. Better yet if they’re hosted in the cloud: I don’t have to bother too much about backups and I get the added benefit of anywhere-access.

For now, however, I’m going to get back to installing the California Digital Library’s eXtensible Text Framework (CDL-XTF), a solution that looks like a promising offline variant.