Top 3 Beers of 2015 (So Far)

For summer 2015 I am predicting light, crisp tastes will be the big thing. For me, 2014 was the year of hazy, knockout IPAs like Le Castor's redoubtable Yakima, with its notes of mango, freshly cut flowers, and diesel fuel. 2014 was also the year of saisons, a varied and complex bunch, some of which were as heavy as the IPAs, while others were like blanches but with a bitter punch lurking behind mild banana bread.

So far in 2015, there have been encouraging signs everywhere, but Trou du Diable is still, pint for pint, the best brewery in Québec. Their new offerings rank #2 and #3 on my list. However, it's Dunham that truly distinguish themselves with the incomparable Cyclope. I'm happy for Dunham, particularly after a 2014 characterized by a lack of innovation and a beer list that was starting to feel a little stale.

Cyclope (Dunham)

This IPA perfectly summarizes what I feel the tendency will be for the summer months. By way of an evolution, la Cyclope is a much drier and crisper offering than its 2014 counterparts. The tropical fruit is gone, and you get a piney citrus in its place. Note that Cyclope is prominently marketed for its Mosaic and Lemondrop hops, although apparently there is a "beta" build with completely different hops. I'm interested to try this.

L'Amère de Glace (Trou du Diable)

Champagne, pink grapefruit, bergamot, and a faint hint of shoe leather. A crisp and bitter lager in the pilsner style, but so astringent it makes Czech pilsners taste like Chips Ahoy. This is dry, dry, dry but at 4.5% it's an incredible value for a hot summer afternoon.

Aldred (Trou du Diable)

Here's another one clocking in around 4.5%, which is interesting because last year Trou du Diable's batting average seemed to be up around 6.2%. This is a light blonde ale. It doesn't make the dramatic entrance of L'Amère de Glace, instead serving as a mild, humble complement that won't dominate the taste buds. I think this is in bottles only; I haven't seen it on tap.

Nathan Barley, Apple, and the Fall of Music 2005–2015

Nathan Barley

On a recent flight, I rewatched the entire six-episode run of the Chris Morris hipster satire Nathan Barley. The show occupies a certain place in my heart, and as it turns out I can still recite huge chunks of its appalling dialogue.

Nathan Barley originally aired in 2005 on the UK’s Channel 4. The show’s eponymous character lives in a gentrifying east London neighbourhood and runs a web company that produces Vice-like “urban culture dispatches” aimed at hedonistic twentysomethings. Like many people I remember meeting during this era, Barley simultaneously dabbles in web design, DJing, and the worlds of art and photography, although only in service of getting invited to better parties. Barley is also an awful, grasping nitwit, and his vacuity—contrasted by his dour, jaded magazine columnist foil Dan Ashcroft—is the cringey comedic lynchpin of the series.

Barley has largely aged well, barring a few themes that would likely be handled with more sensitivity today. What’s amazing is how central music is to the show’s premise, and by extension to the premise of 2005-era hipsterism. Music is everywhere in Nathan Barley. Barley carries a proto-smartphone with built-in DJ decks. Characters attend music video launch parties, club nights, or simply make batshit electro at home like the Ashcrofts’ roommate Jones. Ten years ago the only hipsters were music hipsters, but nowadays there are fermentation hipsters, boardgame hipsters, knitting hipsters, and anything else you could imagine. The rise of social media and affordable services like Squarespace has made it unnecessary to dabble in tech and other media simply to further some primary ambition, but is music really as central to being cool as it used to be?

I’m going out on a limb here to say that music in and of itself is less important today than it was in 2005, but there is also the possibility that it simply occupies a less dominant narrative position than it did. Perhaps recorded music was a kind of Trojan horse, cracking open the youth market via the iPod before moving on to other things. (Single-mechanic free-to-play iPhone games? Game of Thrones? I have no idea what young people are interested in.)

Apple Music

At WWDC a couple of weeks ago, Apple launched their sprawling, multi-layered Apple Music offering. The idea here seems to have been a comprehensive response to Spotify (and, to a lesser extent, other streaming services), which Apple have so far failed to deal with in any meaningful way. Apple’s monetization of music, downloading a song or album for a flat price, seems painfully outdated and has remained unchanged for a decade, barring extensions like iTunes Match and outright missteps like Ping, all of which sought to add value to the music files you bought and downloaded.

People I’ve talked to about the Music portion of the WWDC keynote have had mixed reactions. The presentation felt meandering and overly long, with the obligatory celebrity tie-ins neither suitable for the audience nor useful in explaining the service Apple is offering. Personally, I find the premise of “human-curated” music spectacularly weak. I don’t buy the assertion that Apple really believe that algorithms simply aren’t sufficient to offer a pleasing music discovery experience, but I also wouldn’t believe you if you told me they had invested in an algorithm either. Curation is just not an experience Apple have ever succeeded at, and not an experience I think they really care that much about. I think music discovery in 2015 occurs everywhere, and often through social networks Apple doesn’t own. I also think that for the tiny minority of (mostly young, urban) hipsters who are deeply, deeply invested in discovering new music, Apple’s offering will be cumbersome, out-of-date, and deeply uncool.

As one slightly bewildered commentator put it, “Did Apple… just… put radio on the internet?” From a leader in the music space ten years ago to a deeply uncool dadrock discovery engine. What is this all about? Are Apple really that bad at anticipating the needs of their music consumers, or is it that music itself simply doesn’t mean what it once did?

Exit Music

In one way, music seems to have been the battlefield on which almost every technological advancement in consumer entertainment is initially fought. From the radio age to the Ed Sullivan Show’s live Beatles broadcast to MTV’s music-as-advertisement-for-itself. Onwards to file sharing, better compression, more bandwidth, and finally the iPod, which provided a gold standard for an in-pocket device that prefigured the iPhone in more ways than the actual cellular phone did.

Is it that, ten years ago, being a hipster simply meant being into music and technology? Is it just that everyone’s either a hipster of some kind or else simply into technology in ways that weren’t possible in 2005?

Or, is it that music is now so deeply integrated into every other entertainment format that it’s absolutely everywhere and yet nowhere to be found at the centre of anything? It’s the halftime show and background fodder on YouTube, but perhaps music itself is simply no longer the centre of our cultural lives.

My "Email Workflow"

"Checking one's email," taken as a set of personal and unspoken/unwritten practices for managing communication and the artifacts left behind by communication, is rarely discussed anymore. Ten years ago, before Twitter and Slack, it was fashionable to discuss one's own personal workflow for dealing with the metastisizing fungal bloom of email that seemed poised to end the world with a single Reply All...

This all seems relatively quaint now, like Victorians discussing the proper etiquette for reading broadsheets on public transport, but back in 2007 Merlin Mann, Inbox Zero, and the GTD school of time management loomed pretty large in my world. Around that time I remember trying to control Mail.app exclusively with the keyboard using custom actions in Quicksilver. (Remember poor old Quicksilver?) Email notifications—in the way we think of them nowadays—were relatively new (remember dear old Growl?) and I wanted to immediately deflect incoming communication into a folder without taking my fingers off my PowerBook G4 keyboard. Like I say, relatively quaint.

But the truth is that email is still a huge problem. It's a set of protocols and implementations and hacked workflows that have largely outstayed their welcome. Because half the world still use only email for every business communication, the rest of us just sort of have to find a way to manage our inboxes.

Recently, a colleague accidentally screenshared his inbox to howls of laughter from the assembled team. He had over 500 unread emails—hundreds of which probably came from ticketing and project management software bots—and a convoluted folder structure he no doubt tries diligently to maintain.

As I thought about my own process, I was reminded of another recent situation where dryly and dispassionately documenting the numbered steps in a previously undocumented workflow resulted in an unexpectedly clear understanding of exactly how poor an experience it represented.

My "Email Workflow"

  1. Open email on any device. I have three folders: Inbox, Archive, and Trash.
  2. Sweep notifications generated on behalf of other systems. 99% of these can be deleted outright. If an automated reminder was generated that explicitly requires an action within the next 1–8 hours, it stays. (This is the one step a machine could plausibly do for me; see the sketch after the list.)
  3. Inspect and accept/decline meeting invitations.
  4. Sweep and archive (but not delete) all messages for which I'm a carbon copy but am only peripherally involved in the project. Based on the sender and subject, this is usually pretty easy.
  5. Sweep and archive any messages for projects in which I'm directly involved, but for which no action is required in this email thread. Optionally send quick thank yous to close off threads.
  6. For any messages that contain new information about a project (URLs for staging servers, delivery dates, etc.) IMMEDIATELY copy this information into Evernote and tag it.
  7. Sweep in order to keep only the most recent or most actionable thread pertaining to a given project/client/XYZ in my inbox and archive any older ones.
  8. At this point, almost every item in my inbox is meaningfully related to a possible next action and can help me identify who to communicate with next.
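
Step 2 is the only part of this I could imagine handing off to a script. Purely as an illustration, here is a minimal sketch in Python using the standard-library imaplib module; the server, credentials, bot-sender list, and the "Trash" mailbox name are all hypothetical placeholders, and a real version would still need the 1–8 hour exception I currently apply by eye.

```python
# Hypothetical, minimal version of step 2: sweep automated notifications.
# Every value below is a placeholder -- swap in your own.
import imaplib

IMAP_HOST = "imap.example.com"
USERNAME = "me@example.com"
PASSWORD = "an-app-specific-password"

# Senders whose messages are almost always safe to delete outright.
BOT_SENDERS = [
    "noreply@tickets.example.com",
    "notifications@projects.example.com",
]


def sweep_notifications() -> None:
    """Move automated notifications to Trash, leaving everything else
    in the inbox for the manual steps (3 through 8)."""
    conn = imaplib.IMAP4_SSL(IMAP_HOST)
    conn.login(USERNAME, PASSWORD)
    conn.select("INBOX")

    for sender in BOT_SENDERS:
        # Find every message from this sender, read or unread.
        status, data = conn.search(None, "FROM", f'"{sender}"')
        if status != "OK" or not data[0]:
            continue
        for num in data[0].split():
            conn.copy(num, "Trash")                 # copy into the Trash mailbox
            conn.store(num, "+FLAGS", "\\Deleted")  # flag the original for removal

    conn.expunge()  # actually remove the flagged messages from the inbox
    conn.logout()


if __name__ == "__main__":
    sweep_notifications()
```

Run something like this before step 3 and the manual sweeps that remain get a lot shorter.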

What is it Worth?

For the record, even after writing it down, I still think it's pretty sound. But the question is, do I need this at all? If my email is simply the detritus of other people's decision to keep using email, what is my continued diligence really buying me?

Deep UX Trolling

Eli Schiff retweets fall into a weird zone where I can't tell whether they're genuine retweets ("I agree with this, and am therefore retweeting") or one of those joke retweets ("look at this daft thing someone else said"). At best, there's a baffling, written-by-a-robot quality to Schiff's posts. At worst, a kind of screwball Ayn Rand undergraduate troll quality that I find unsettling. Here's an excerpt from, ahem, "Fall of the Designer Part III: Conformist Responsive Design":

Ethan Marcotte, who initiated the responsive design movement, posited that there are three core pillars to the philosophy: "Fluid grids, flexible images, and media queries." Marcotte argued that "Thinking of design and implementation as separate concerns impacts the quality of both." He is right that the two are intimately related, however they are still distinct modes of thought.

Unfortunately for Marcotte, his responsive design techniques unintentionally led exactly to the separation and abandonment of visual design principles in the interests of putting implementation first. Today, responsive techniques allow design practitioners and engineers to argue that the centrally important aspect of digital design is whether it adapts to multiple screens using fluid layouts to the exclusion of any other need.

Thus application design has suffered greatly from lackluster responsive and mobile-first approaches. Instead of optimizing designs to each platform and usage paradigm, now designs tend to be one size fits all.

As far as I know, responsive design has never been one particular "thing." It's certainly not a monolithic and totalizing narrative that was authored by any specific person nor did it lead to the abandonment of anything in particular. In the context in which Schiff is referring to it, it's a set of web development trends and standards, constantly changing and arrived at tentatively. But responsive design, really, is just an attitude that reflects an overall turn toward user experience and platform agnosticism in tech business and service delivery.

The turn toward user experience, felt most strongly since 2010, certainly does have a few defining characteristics, but they're complex, contingent, and economically driven.

  1. There is an acknowledged need to provide a starting- or end-point for most experiences in a web browser. This is a good thing.
  2. More than ever, users want or need to interact with a single application or service across multiple devices, in both native and non-native formats.
  3. Users don't primarily engage with something based on whether the experience is native, non-native, or just a web browser. They don't care. They want the service or product.
  4. Tech business is no longer tech business. We're talking about transportation, clothing, and "service provider" startups we never thought possible. Uber? Airbnb? If you're selling a service like these, you better offer a consistent user experience.
  5. In the past, you developed your application for platform XYZ, and then people who were invested in platform XYZ came shopping for applications that ran on platform XYZ. Maybe they chose yours. Well, the 90s are over.
  6. "The Enterprise" is finished. We're a handful of retirement parties away from the end of the dominant software business paradigm of the 1980s and 1990s.

More Eli Schiff:

The vast adoption of the hamburger menu in web as well as in native mobile and desktop application design is proof of the misguided thinking that responsive thinking promotes. Instead of finding the ideal solution for each platform, the designer intent on implementing boring solutions can just apply the hamburger menu to each platform and call it a day.

Hamburger icons are controversial. But, I assure you, the ubiquity of the hamburger menu is not something that came at the expense of "finding the ideal solution for each platform." In fact, it is or was a genuine attempt to present the user with a consistent set of interactions. It's about helping the user find the menu options instead of forcing them to navigate some bizarre custom animation on an app-by-app basis, like the faux-sunrise HUD with blinking orbs I saw recently. It's about reining in and finally putting to bed the excesses of the last decade's UIs. It's about putting the user first.

I'm old enough to remember a time when nobody considered what users wanted, or maybe they didn't know themselves. What mattered in those days was whether you could sell something to an IT manager or not.

It is therefore a mistake to entirely attribute the shift towards flat design to responsive techniques. In truth, responsive design was simply a convenient catalyst that allowed OS makers, developers and designers to obfuscate the need to pursue a principled practice of visual design. Responsive design was indeed one of the proximate causes of flat design's onset, but adherence to modern minimalist ideology was the ultimate cause. Responsive techniques are undoubtedly crucial to providing a multi-platform design, but they should not come in the way of a user-centered experience.

WHAT THE SHIT.

A Job For Life: The Workplace Economics of Dungeoneering 1974–2015

In the March issue of Harper’s, Esther Kaplan has an interesting article about the looming horror of the data-driven workplace:

In industry after industry, [data collection] is part of an expensive, high-tech effort to squeeze every last drop of productivity from corporate workforces, an effort that pushes employees to their mental, emotional, and physical limits; claims control over their working and nonworking hours; and compensates them as little as possible, even at the risk of violating labor laws.

Almost anybody who has worked in retail or hospitality over the past 10 or 20 years has been subject to Just in Time shift scheduling. In these and other sectors, employees are routinely kept in a perilous state of “on-call” that, above all else, seeks to reduce labour costs for the business. Kaplan’s article is also interesting for its examination of how Big Data has begun to affect other sectors. (I’m sure we all suspected that UPS drivers are subject to some scary metrics, but, well, now we really know.)

I've blogged before about how the gameplay of Darkest Dungeon conceals a representation of the shift in the tech and gaming sectors toward lean, iterative design. The game itself reduces the sprawling, Tolkien-esque fantasy waterfall quest to short bursts of focused teamwork and brutal financial compromises.

I’ve also tweeted about how the early popularity of Dungeons and Dragons may have been in part due to its ability to offer teens an opportunity to preemptively master the structure, meetings, and paperwork of the 1980s workplace.

After reading Kaplan’s article, I’d argue that the Just in Time dungeoneering of Darkest Dungeon doesn’t just reflect changes in software development, but wholesale changes in the working lives of ordinary people in almost every imaginable workplace. By examining different representations of dungeoneering (on the tabletop and on-screen) over the past 40 years, we can see the slow but inexorable change from stable, reliable employment to pernicious zero-hour contracts.

Consider this passage, particularly after your favourite Darkest Dungeon hero has been bludgeoned to death only to be replaced by another virtually identical highwayman:

In postwar America, many retailers sought to increase profits by maximizing sales, a strategy that pushed stores to overstaff so that every customer received assistance, and by offering generous bonuses to star salespeople with strong customer relationships. Now the trend is to keep staffing as lean as possible, to treat employees as temporary and replaceable, and to schedule them exactly and only when needed.

In the 1980s, the dominant narrative of the adventuring party was such that, at the end of a noble quest, fame and fortune were almost certainly available to any hero. Adventuring parties were often fixed groups of individuals for whom a slow but steady rise in level and salary was guaranteed. Permadeath was rare and, with a little patience, rewards were plentiful. Consider the adventuring party and game mechanics presented by JRPGs prior to Final Fantasy VII.

This seemed to change in the 1990s, when the idea of employee interchangeability and flexible scheduling began to encroach. On a recent playthrough of Baldur’s Gate (ported to iOS in 2012), I was struck by how dramatically different the adventuring party I started with was a few short hours later. Heroes came and went, often replaced on a whim or due to some brief situational advantage. There is also the notable death of Aeris in Final Fantasy VII, the first time I can remember a playable character, well, dying. Like, for real.

In recent years, tabletop gaming representations of dungeoneering have centred on an anachronistic and conservative misreading of early D&D that emphasizes mechanical difficulty and permadeath, despite the fact that almost nobody actually played D&D this way during the 80s. In the recently published Torchbearer, dungeoneering is so brutal it figures as a sort of morbid entry-level service job. Darkest Dungeon is not far behind (but admittedly I try my hardest to keep my heroes fed, happy, and stress-free).

If dungeoneering isn’t profitable anymore, what does this say about our day jobs? If it’s no longer possible to reach the end of the dungeon among the same group of noble companions we started with, how are our relationships with each other changing to match?