Vishnu Gopal

blog.vishnugopal.com

Amateur Fiction at its Finest

Home Library

I picked up my reading habit from my mum. I read everything from Sheldon, Christie, King and the like to Yeats, Whitman, Neruda, Donne, Dickens, et al. It sure helped that mum had a huge library of her favourites (yes, that’s the picture above), and I read most of them twice over before I left school.

It was also in school that I started to write myself. When you’re an amateur author at that age, you really like having somebody read your work. These other readers more often than not turn out to be amateur authors themselves, and then of course there’s a quid pro quo. That’s how I started to read unpublished writing, and while over the years I’ve lost the muse, I still read a ton.

When I say this to other readers, they often look at me mystified. Amateur writing, seriously? Is that like fan fiction? Where do I get to read it? Doesn’t most of it require a spell check & a good editor? To the last: well, yes. But amateur writing is in an electronic format, and that brings with it some unusual advantages. Let’s banish the negatives from our mind, and take a look at the upside:

  • Let’s take length first. Who decided that a novel should be 40K words plus? How about a 500K mammoth saga? Or a 300 word scene? Some of these lengths bring about some wonderful stories & different ways to consume them.
  • Controversial & sidelined genres such as polyamory, serious examinations about bending legal age, or even religious & political writing find a place in amateur writing. There is nobody to judge.
  • Unusual genres, such as self-insertion, mind-control, time-travel, and gender-bending novellas are cool. Who doesn’t need more 70s SciFi?
  • Crossovers: because of copyright, this has been largely unexplored as a mainstream genre. But wouldn’t you, as an author, like to build upon the worlds of Tolkien & Rowling? Perhaps combine them, mix and match?
  • Alternate timelines: just this: imagining redoing the travesty that is the Harry Potter finale.
  • Serial writing: just like TV shows, writing doesn’t have to be published all at a go. Amateur authors regularly publish a story a chapter at a time.

Having said that, the proof of the pudding is in the reading: these are five good stories written by unpublished authors that I’ve read over the years:

Delenda Est: a Harry Potter and Bellatrix Lestrange (!?) alternate timeline story, an incredible pairing that somehow Lord Silvere manages to turn into an amazing story. Lots of plucky Bella here. This could well be the Bella all of you panted over.

The Book: a mind control story: what happens when a middle-aged guy picks up a book that has symbols in it that gives him the power to control other people’s minds? Blackie weaves a tale that has the usual vagaries that come with mind-control, but also a surprisingly intelligent & coherent story behind it.

The Adventures of Me and Martha Jane: probably one of the books that has influenced me the most with its ideas of love, longing and blurred relationships, SJR’s Martha will stay with me forever. Forget the warnings on the page and dig into the story; it’s worth it.

HPMOR: who says Harry Potter has to be irrational? LONG, good Harry Potter fan-fiction.

Sennadar: this should have been published. It’s as good as, if not better than, a lot of what passes for fantasy now. A classic coming-of-age fantasy novel, but with a were-cat twist.

And that’s just the tip of the iceberg. I’m both excited & worried about the future of amateur writing. Excited, because efforts such as Amazon’s Kindle platform have provided a level playing field for a lot of budding writers out there. And worried because while the platform is egalitarian, it’s also commercial, which means a ton of potential authors will write stories in genres that are more popular & more acceptable. Amateur fiction until now has never been about making money; it’s about an author slogging away at a computer somewhere writing something that just needs to be written. But maybe that’s just the old man in me speaking. There has never been a time when the distinction between a “published” author and an amateur one has been so slim. That, in my books, can only be a good thing.

The Malayalam Movie Comeback

Thoovanathumbikal

A big loss for me as a movie enthusiast was that when I was growing up, the Malayalam movie industry was in a formulaic rut. While local movies still used to bag critical awards, most “popular” & “commercial” films cast Mammooty, Mohanlal (please don’t click that link) or other A-list stars and had the usual Dhamaka Dishoom fare. Each movie revolved around the actor, who always had labels prefixed with Super, Mega, Ultra & so on, and the films resembled caricatures of superhero movies.

It’s only much later & very recently, when Uma & her friends introduced me to an older generation of movies, that I… simply lost my breath. Padmarajan & his colleagues’ movies were stunning in their breadth & simplicity, and cast real people who portrayed everyday situations, but in a fashion that elevated the craft to art. These movies didn’t shy away from controversial topics: pre-marital sex, rape (not the dastardly erotic treatment of it that came later, but a serious examination), family trouble, relationships that didn’t have labels, what not. Even if you don’t understand Malayalam, if you are a movie enthusiast I urge you to watch subtitled versions of some of Padmarajan’s classics: perhaps Thoovanathumbikal (Dragonflies in the Rain), or Namukku Parkkan Munthiri Thoppukal (Vineyards for Us to Live). And this in the 80s! What happened to Malayalam cinema in the 90s, in contrast, can only be described by one word: tragedy.

What I see now, perhaps starting 2010, is a revival. Talented new directors have emerged who have broken the stranglehold an older generation of actors had on the industry. Perhaps more importantly, there have been several good movies, and dare I say, a few great ones. These directors are courageous as well, and have tackled contemporary topics head on: 22FK was the first movie I saw in a Kerala theatre where a female lead received a standing ovation (regardless of its contentious conclusion, the movie can’t be faulted for not having courage); other movies such as Manjadikuru & Philips and the Monkey Pen give a serious treatment to a previously unexplored genre.

I’m cautiously optimistic about this “new wave” of Malayalam cinema, more so because unlike Padmarajan and his coterie, there seems to be a broader cross-section of directors, producers and artists this time around:

  • A lot of women in control: Ustad Hotel is written by Anjali Menon & the Monkey Pen is produced by Sandra Thomas, amongst others.
  • A ton of good actors: the new heartthrob Fahad, the Malayali serial kisser, who has made even baldness cool (beat that, Bollywood!), a 50-something Lal who acts his age & who honestly has been a revelation for his portrayal of characters on screen, and good & refreshingly sexy female leads like Shwetha Menon, Rima Kallingal, Nazriya, et al.
  • Some “counter-culture” movies, like Kili Poyi, or Idukki Gold, where the actors openly smoke up.
  • And the fact that this energy has infected the older actors as well and brought out their long-dormant mettle: see Drishyam for a long-awaited director’s movie starring Mohanlal, and Pranchiyettan for Mammooty at his brilliant best.

Instead of more reviews & critiques, let me leave you with a list of movies you can watch yourselves & decide. Here’s to a great future for Mallu cinema!

Download a plain text file with Wikipedia links to these movies here.

[For people in Hindi-land who are reading this and becoming bewildered, yes, there's a better movie industry outside Bollywood, and no, for a lot of us Indian cinema doesn't revolve around SRKs, ABs, PCs and their tribe. This is your opportunity to upgrade to a better experience, take it!]

Relief India Trust, the Scam NGO, Files a Legal Notice Against my Blog Post for Defamation

Relief India Trust Google Reviews
Since I wrote my last blog post on Relief India Trust, there have been consistent efforts to take my site down through dubious legal means, the latest of which is a defamation legal notice against me by the Relief India Trust folks. Here’s a factsheet:

  • I received a mail on 9th February, 2013, and also a ton of calls before that date asking me to donate to a charity named Relief India Trust. The people on the call were extremely pushy, and while initially I thought it was because of the urgency of the case, I quickly became suspicious. When I started asking them about their NGO registration number, their 80G registration etc., they hung up the phone and stopped responding.
  • I wrote the blog on 13th of February, and it immediately started accumulating comments from lots of folks who had similar experiences.
  • I forgot about all of this until somebody else at the organisation called me up again. I replied that I knew this was a scam, and they hung up immediately.
  • In June, I received a call from a journalist working for The Sunday Guardian asking me for further details about the Trust. She also visited their office and informed me that there weren’t any kids there, “only a lady who gave us a long brief about the ngo and so forth”.
  • August 5, I received an email from a person claiming to be from RIT and claiming that somebody else was calling on their behalf. I don’t understand how this can be the case because they ask us to donate by going to the same website. Unfortunately I was so disgusted that I did not reply to this message. That mail is available here.
  • In September, another person emailed me saying they had received a take down notice.
  • In November 2013, I noticed that my blog was not loading. When I contacted my hosting provider (wordpress.com), they said my site was under a DDoS attack. Please see my conversation with my hosting provider.
  • In January 2014, they filed a DMCA complaint against me with my hosting provider and took down my site until I could file a counter notice. As many of you will know, DMCA notices are meant for reporting copyright violations, so this was plainly a tactic to take my site down.
  • The latest action from their end is this threatening legal notice I received this week.

India’s defamation laws aren’t exactly the most progressive, but in the interest of not letting such dubious “Trusts” get away with anything, I asked around on Twitter for lawyers willing to provide some legal advice. Anoop reminded me about my school senior Aju who now runs MyLaw.net, and he in turn connected me to Apar of Advani & Co, who is now representing me. My initial interactions with Apar were through Twitter, and he’s been a pleasure to talk to. Find Apar’s response here, wherein he manages to quote Arkell v. Pressdram :-)

Will keep you folks updated.

Only then do we sign our work

apple_measurements

Yesterday at WWDC, Apple unveiled a mission statement that reads more like a poem. There’s a wonderful video as well, but just look at these words:

If everyone is busy making everything
how can anyone perfect anything?
we start to confuse convenience with joy
abundance with choice.
designing something requires
focus.
the first thing we ask is
what do we want people to feel?
delight
surprise
love,
connection.
then we begin to craft around our intention
it takes time…
there are a thousand no’s
for every yes.
we simplify
we perfect
we start over.
until every thing we touch
enhances each life it touches
only then do we sign our work:
Designed by Apple in California.

It’s a wonderful & sincere expression of why we need Apple to exist. And why I love this company so much. I’m beginning to think that the Apple under Jony Ive & Tim Cook might become far better than under Jobs.

VoiceTime, or a future in which Apple can take over VoIP

FaceTime. iMessage. VoiceTime?

Apple has FaceTime, and it has been very successful in introducing simple video calling to a ton of Apple users. That success—as Apple’s usually are—can be attributed to clear directed marketing and unparalleled ease of use. If the person on the other end of the line has an Apple device too, the only thing you need to do is press a FaceTime button and then seconds later, be talking to them via a video call.

A friend of mine was super surprised when he pressed the button and it worked the first time. He expected more of a setup process, perhaps creating an account or entering user credentials. Or heck, maybe being taken through a wizard. It’s the same user flow with their SMS counterpart, iMessage: you text a person and if the other end is an Apple device, the text is delivered via the iMessage IP network. It’s not just Apple doing this of course: a classic example is iMessage’s cross-platform and much more successful rival, Whatsapp. If you have anybody’s mobile number and they have Whatsapp installed, you can text them. Works amazingly well!

So with FaceTime, Apple has video over IP. With iMessage, they have a simple texting solution over IP. What if Apple makes a dead-simple, works over the same mobile number, voice calling service?

First off, some historical background:

  • Apple first introduced the iPhone in June 2007. When it did so, it broke the stranglehold US carriers had over mobile phone vendors: AT&T controlled neither the production timelines of the device, its content, nor its software updates. AT&T even bent over backwards to refit its voicemail service to support Visual Voicemail.
  • In July 2008, Apple introduced the App Store. The App Store further diluted the control that carriers had over the mobile phone experience. Instead of subscribing to mobile VAS services, users could now download and experience much richer mobile apps: apps that could work offline, would work over the internet, and didn’t add any extra charge to their mobile phone bill.
  • In June 2010, Apple introduced FaceTime, a seamless way for any iPhone user to make video calls, effectively destroying any plans carriers had for charging a premium for video calls over their network.
  • iMessage was introduced in June 2011, and continues to deliver 28,000 messages/second that would otherwise have been chargeable text messages.

In short, in a span of less than six years, Apple has steadily worked to decrease the control carriers have over the mobile phone user experience. It has reduced the importance of core apps (text messaging), and provided a better experience for premium ones (video calling, mobile internet, apps, etc.). What if the next milestone is to do to voice what it did to video and texting?

Let’s explore another tangent. Why is Apple best placed to do this first? Android, for example, looks on the surface like a much better candidate: it’s an open environment and there are tons of apps that provide deep hooks into the platform to offer free (or close to free) Voice over IP. Tango is a great example of such an app. Yet there are several reasons why Apple would get there first:

  • Apple has always done what it thinks is best for its users. Android was conceived to build the best mobile phone operating system for carriers: it was unveiled alongside the now largely defunct Open Handset Alliance as a countermeasure to iOS popularity.
  • Android has always put carriers in control. Carriers control the content of the phone and they can customise Android to a large extent, and because of such customisation, Android updates are slower to roll out.
  • It’s a better operating system than Symbian, but the core philosophy is the same: put carriers in control, give them a smartphone OS choice largely equivalent in feature-set to iOS, but with none of Apple’s control.
  • Google has never developed a competing phone-number-based video-calling or text-messaging solution. Hangouts—which is their latest unification of text and voice—asks for & confirms your mobile number but then does nothing obvious with it.

What is the bare minimum experience that Apple needs to provide to make VoiceTime a success?

  1. First and foremost, it needs to be transparent to the user. If I dial a contact and I’m connected to the Internet via WiFi, 3G or LTE & the receiving device is an Apple device, the call should be routed via VoiceTime. No fuss, no setup.
  2. If I’m not connected to the Internet, the call should be routed via cellular with the same experience. Just like iMessage, the dialer can subtly indicate (via a change of colour perhaps) that I’m on a cellular connection instead of VoiceTime.
  3. Third, and another important one: transparent handoff between cellular and VoiceTime calls. That is to say: if I start a call on VoiceTime & my Internet disconnects, the call is handed over to a cellular MSC without a call drop.

The third is the most technically intractable problem. But like every open problem, somebody more intelligent has probably thought of solutions. In this case, IEEE 802.21 (media-independent handover) specifically describes this problem, and several workable solutions.
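The first two requirements boil down to a simple routing decision at dial time. Here’s a minimal sketch in Python with entirely hypothetical names (Apple has published no VoiceTime API, of course; this is just the shape of the logic):

```python
from dataclasses import dataclass

@dataclass
class DialContext:
    callee_number: str
    caller_has_internet: bool   # WiFi, 3G or LTE available?
    callee_has_voicetime: bool  # does this number map to an Apple device?

def route_call(ctx: DialContext) -> str:
    """Pick a route transparently: VoIP when both ends support it,
    plain cellular otherwise. The user just dials; no setup, no fuss."""
    if ctx.caller_has_internet and ctx.callee_has_voicetime:
        return "voicetime"  # dialer could subtly tint, like iMessage's blue
    return "cellular"       # fall back with the exact same user experience
```

The third requirement, mid-call handoff, would amount to re-running this decision whenever connectivity changes and migrating the live call without a drop, which is where something like media-independent handover comes in.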

Why hasn’t anybody done this yet? Simply because mobile operators don’t want to lose the last bastion of money-making they have. In India, despite having the lowest call rates in the world, voice calls contribute the most to subscriber ARPU. Nobody is looking for a blue ocean strategy when their existing business is doing just fine, thank you!

But why do you, as a subscriber want VoiceTime or an equivalent solution? There are so many reasons:

  1. It’s stupid to pay thrice: once for your data plan, once for your voice minutes, and again for your texting. The mobile world should be like your internet world. You pay once, for data.
  2. The Internet Protocol has proven to scale as well as (or better than) any telecommunications network. The questions of reliability that led to the creation of dedicated protocols for voice telephony largely don’t exist any more. Indeed, the next generation of telephony networks (SIGTRAN and LTE) are based on IP. If the future is a data-driven Internet, why are we still living in the past?
  3. The reality is that voice calls over cellular networks suck. Frequent call drops, cross connections, terrible voice quality and lack of any presence features is not what you would expect in 2013. Skype-like voice quality, a network more resistant to call drops and integration with the larger Internet & other devices is just an implementation away.

Would you be willing to pay Apple for such a technology? Let’s say Apple charged something like $10 a year for VoiceTime & let carriers have 70% of it (Voice is just another app of course), would you pay?

Literate CoffeeScript is sweet. It’s a great implementation of something that I’ve always thought is fundamental to good programming: we write code for ourselves and other humans, not for a computer. We’re no longer dealing with slow-as-molasses computers that demand efficiency, but we do have a lot of additional complexity to handle: a ton of language feature creep, concurrency nightmares and multiple service architectures. By making explanation the fundamental purpose of a programming language and giving it first-class status—and, equally important, running indented code seemingly as an afterthought—Literate CoffeeScript gives importance where it’s due. Think before you code, write down what your intent is. And make programming seem a lot more like crystal-clear writing.
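To make the idea concrete: in a literate source file, prose is the default and only indented blocks compile as code. Here’s a toy extractor in Python (not CoffeeScript’s actual compiler, just an illustration of the convention):

```python
def extract_code(source: str, indent: int = 4) -> str:
    """Keep only the lines indented by `indent` spaces (the code);
    everything else is prose and is simply ignored."""
    kept = []
    for line in source.splitlines():
        if line.startswith(" " * indent):
            kept.append(line[indent:])
        elif not line.strip():
            kept.append("")  # preserve blank lines between code blocks
    return "\n".join(kept)

document = """
First we explain, in plain English, that we want to greet the world.

    greeting = "Hello, literate world!"

Only the indented line above is executed.
"""

exec(extract_code(document))
```

The explanation comes first and the machine politely skips over it: exactly backwards from the comment-as-afterthought convention most languages use.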

OS X

I think it’s popular now to criticise Apple, and particularly Macs, but there’s one simple reason that I switched to Macs and have been using them ever since:

I was tired of spending time on my computer working on my operating system instead of working on my projects.

Do read: 25 Years to Mac – How Ubuntu Pushed Me Away from the PC. I do go back to Ubuntu every now and then in Virtual Machines, but it doesn’t come close. And Windows is not *nix, so that’s never an option.

I think what Google should have done was build their own version of Linux, just like they built Android inspired by iOS. I bet lots and lots of folks would have paid money for a stable, usable, friendly & Googley Linux. And stuff like their Pixel laptop could’ve done more than just run a browser.

Delivering a Quick and Quality Software Product

 

There’s always a tradeoff between time, money and quality in software development. More often than not these days, product managers are asked to deliver excellent products on a tight schedule and on a budget. Agile processes and the trendy idea of the minimum viable product only confuse things further because they seem to prioritise quick iterative development and a tight rein on features. How does one build a good product on time while keeping startup thriftiness in mind? I have six clear steps for you to follow:

One: Before building anything, dig to the roots and find product goals. Ask your client (or yourself) “Why do you need this product built?”. Answers like “We need to run a twitter campaign” or “We want a CRM for our sales team” are not clear enough. “We need to run a twitter campaign to promote our new startup” & “We need a CRM that tracks and updates senior managers about daily sales” are better answers, but ideally what you’ll dig for and get is the true reason behind the product’s existence. For the twitter campaign, it could be “reaching out to potential customers who are likely to purchase our startup’s product”. For the CRM, it could be “improving sales performance and making our sales team more accountable”. You are digging for goals—the why—not features—the what—that comes later, and that’s less important.

What the Customer Wanted

Two: Minimum viable product is a great idea. But when choosing what to whittle down, keep the goals in mind. I have little patience for managers who come back and tell me, “But isn’t this what we decided to build?”. If you can’t deliver what your clients need, then you aren’t doing a good job. Keep product goals in mind, and prioritise features that provide the most bang for the buck. Only then will Agile and MVP work well for you.

Three: Find great developers. People matter more than your process. Great developers can manage themselves, figure out their own deadlines and do a lot more than a mediocre developer. Give them great tools too: a good working space, the editor they want, and their operating system of choice. With a good team, it’s also much better to split work into spheres of responsibility rather than into discrete chunks where many developers work on the same piece of code. Adopt practices like pair programming & Scrum and promote discussion within the group. If at all possible, guide every discussion to a consensus and make sure every developer is aware of potential tradeoffs.

Four: Try as much as possible not to build up engineering debt. This is easier said than done, but a few rules go a long way: adopt a good source control system and practices; have development, staging and production deployments for every product; run a continuous integration and deployment server; and put a system in place to code-review every commit before it gets merged back into the main tree. Make your developers police each other.

Five: Have an independent team do security audits & quality assurance. This is a “manual testing” team, and while they might use several different tools, their mandate is to make sure the product has the necessary features, doesn’t have bugs, and is secure. This team can be external to the company as well, but it needs to build a good rapport with the engineering division so that they don’t work at odds.

Six: Have regular deployments or sprints. Nothing focuses a developer more than a predictable schedule of deployment. Divide chunks of work into sprints and have staging deployments between them. Make sure that developers are responsible for successful deployments, and woe betide the person who breaks the build.

Well, those are the rules. Sticking to them is hard, and at times impossible, but my experience is that any deviation has always resulted in a loss of quality. While that might be acceptable in the short term, in the long term it always comes back to bite you on the ass. Still, being a product manager is one of the best jobs out there! Figuring out the nitty-gritty details of product construction, overcoming obstacles, finding consensus within the team and outside, and finally building a product that meets the requirements is very rewarding. I’ve done a few stints here at MobME building good products and I’ve loved every challenge.

Stanford Developing Apps for iPhone and iPad is Awesome

Update: I’ve started a Github repo that has completed assignments.

I’ve mentioned this before countless times to friends: the Stanford course on iOS Development is really mind-blowing. It’s the best way to learn iOS and Objective-C: Paul Hegarty is a wonderful teacher, the content & density of the slides are excellent, and the complex Cocoa Touch framework and the Xcode 4 development environment are brought out very well through hands-on coding sessions.

MVC on iOS

Just to take one example, take a look at this slide above detailing MVC in iOS. I can point out dozens of developers who wouldn’t have this condensed understanding even after months with the ecosystem. In this small slide, you have:

  • Models, Views & Controllers with road lines between them depicting the fact that models and views never talk to each other (double-yellow line), controllers always talk to models and views (dotted white) and when models and views need to talk to controllers, they do so in very defined ways (solid white).
  • When views need to talk to controllers, they either fire an action arrow into a controller target, or delegate will, should, and did methods to the controller, or, when they want data from the controller, set up a datasource and ask via data-at & count methods.
  • When models need to talk to controllers, they set up a radio station and broadcast notifications that controllers then tune into.
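Those three communication patterns are easy to mimic outside Cocoa. Here’s a Python sketch of the same slide, with hypothetical names for the target/action, datasource, and notification “radio station” pieces (these are stand-ins, not the real iOS APIs):

```python
class NotificationCenter:
    """The model's 'radio station': it broadcasts, controllers tune in."""
    def __init__(self):
        self._listeners = {}
    def observe(self, name, callback):
        self._listeners.setdefault(name, []).append(callback)
    def post(self, name, payload=None):
        for callback in self._listeners.get(name, []):
            callback(payload)

center = NotificationCenter()

class Model:
    """Never talks to views; broadcasts changes instead of calling anyone."""
    def __init__(self):
        self.items = ["a", "b"]
    def add(self, item):
        self.items.append(item)
        center.post("model-changed", self.items)

class View:
    """Fires actions at a target; pulls data via count/item-at queries."""
    def __init__(self, target_action, datasource):
        self.target_action = target_action  # like a button's target/action
        self.datasource = datasource
    def tap(self):
        self.target_action()
    def render(self):
        return [self.datasource.item_at(i)
                for i in range(self.datasource.count())]

class Controller:
    """Talks freely to both sides, and doubles as the view's datasource."""
    def __init__(self):
        self.model = Model()
        self.view = View(target_action=self.on_tap, datasource=self)
        center.observe("model-changed", self.model_did_change)
        self.last_seen = None
    def on_tap(self):                   # action fired by the view
        self.model.add("c")
    def model_did_change(self, items):  # notification from the model
        self.last_seen = list(items)
    def count(self):                    # datasource methods for the view
        return len(self.model.items)
    def item_at(self, i):
        return self.model.items[i]
```

Note the double-yellow line holds: the model and view classes never reference each other; everything flows through the controller, its datasource role, or the notification center.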

And this is just one slide in the introductory lecture. It’s a wonderful time to be a self-taught app developer. The Stanford course is ongoing right now and lectures are being updated on iTunes U. There’s a Piazza discussion forum for the course as well.

Relief India Trust is a Scam NGO, Do not Donate!

relief-india-trust-fraud-scam

So I recently got a bunch of calls from an NGO calling itself Relief India Trust. The calls themselves were a bit odd and overzealous, with the volunteers flinging themselves at me and describing the plight of children in gory detail, but it was the frequency of the calls that first gave me a clue that something wasn’t right. Then, when I finally decided to put in some money, a volunteer called me up, took me through the donation process and then offered to stay on the line until I gave her a Transaction ID for the bank transfer.

I did a quick Google search and found tons of consumer complaints about the organisation. And if you notice their NGO registration number in the link above, it’s just a plain four-digit number: 3696. The Indian Government has an NGO search site where you can search for voluntary organisations by name or their unique registration ID. 3696 doesn’t turn up Relief India Trust. Neither does a search for their name. These are a bunch of scam artists.

Real NGOs probably needn’t be this aggressive. I donate every year to SOS Children’s Village India, which has been my charity of choice since forever. They have a unique take on bringing up disadvantaged children, a personal dialogue (if you wish it) between your kid and you, and I’ve visited their centre in Kerala too.

Update: I’ve received a legal notice from Relief India Trust. Further developments in a new blog post: Relief India Trust, the Scam NGO, Files a Legal Notice Against my Blog Post for Defamation

Silver Linings Playbook


Silver Linings Playbook is a really good movie. I’m a complete sucker for good romantic flicks, so this was no surprise, but the story of two psychotic folks finding each other was well done. It could’ve been slightly better, a little edgier perhaps, but it’s still really good. And oh, Jennifer Lawrence is hawt!

Analysing IRCTC Traffic

So it needn’t be said that IRCTC is sluggish at the best of times and at the worst, darn unusable. Is it because of astronomical load? Bandwidth issues? Technology mismanagement? Resource exhaustion? Let’s take a look and try to guess.

This news article from July 2012 mentions that IRCTC did 4.47 lakh bookings in a day. Doubling that figure to get current estimates makes it 10 lakh bookings a day. To get the number of folks accessing the site trying to reserve tickets, let’s multiply that figure 10x. That makes it 100 lakh folks (i.e. 10 million) as a guesstimate who will try to reserve tickets using IRCTC in just a single day.

The first question to ask: are that many tickets available? The answer is an emphatic no. The Centre for Railway Information Systems (CRIS), which develops CONCERT, the backend engine behind web bookings, only does around 2.2 million reservations a day. So there’s no way IRCTC can satisfy 10 million users. We don’t have that many seats to reserve.

The second: is 10 million visits a huge number? This is a tough question to answer because it depends quite a bit on what those folks are doing. On most days, the IRCTC website homepage is accessible and loads up reasonably quickly, but when you try to login, search or book, it falls flat. So bandwidth exhaustion is most likely not the root cause: 10 million visits a day averages out to only about 115 visits a second, which shouldn’t translate to a lot of bandwidth. Besides, even 100Mbps+ pipes are cost-effective for an organisation the size of IRCTC.

My theory is that the problem is likely the CONCERT backend. It’s not built to handle this kind of consistent load, especially at peak Tatkal hours when traffic per second could be much higher than what I’ve estimated above. And CONCERT was probably never built with online reservation in mind; it was built for a world where people queued up physically, talked to a railway official and booked tickets. And neither CRIS nor the Railways would want an immediate revamp. So what can IRCTC do about it?

There is a very simple solution: decouple the user experience from the backend booking system. I’ll explain what I mean in screenshots:

IRCTC Wait 1

Step 1: User goes to the IRCTC website, logs in and is taken to a page that has a queue number and an approximate wait time on it. This queue number is the position in the queue and is allotted on a first-come, first-served basis; it resembles a physical queue where folks wait to book tickets via CONCERT. The page auto-refreshes and updates this information periodically.

IRCTC Wait 2

Step 2: When the user reaches the front of the queue, he’s taken to the search & booking page. IRCTC perhaps admits around 500 customers a minute this way (or however many CONCERT can reliably handle) & booking proceeds as usual.
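The two steps above amount to a virtual waiting room in front of CONCERT. Here’s a minimal sketch in Python (the admit rate of 500/minute is illustrative, as above):

```python
from collections import deque

class WaitingRoom:
    """First-come, first-served queue that throttles entry to the
    booking backend to a rate it can reliably handle."""
    def __init__(self, admit_rate_per_minute=500):
        self.admit_rate = admit_rate_per_minute
        self.queue = deque()

    def join(self, user):
        """Step 1: user logs in, gets a queue number and an ETA."""
        self.queue.append(user)
        position = len(self.queue)
        eta_minutes = position / self.admit_rate  # shown on the auto-refreshing page
        return position, eta_minutes

    def admit_next_batch(self):
        """Step 2: once a minute, the front of the queue moves to booking."""
        n = min(self.admit_rate, len(self.queue))
        return [self.queue.popleft() for _ in range(n)]
```

When traffic is below the backend’s threshold, the room can simply admit everyone immediately, so the queue only kicks in when it’s needed.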

Advantages

  • IRCTC booking is no longer a lottery. This is perhaps the biggest win.
  • Customers who log in bright and early get the first slots.
  • Late customers get a higher queue number and won’t get tickets, but IRCTC can always blame the other people ahead of them in the queue and not the faulty technology for it.
  • The queueing system is optional when traffic is low and below CONCERT’s reliability threshold.

As I see it, if there’s no way to work around limitations of older software, the only thing you can do is remove customer frustration. And it’s high time they did something to improve the Tatkal madness.

Hat-tip to Anoop for making me dig deeper into this.

OSX Encrypted Disks & Passwords

So if you’re like me and have a ton of different passwords to manage but don’t know how to store them securely, here’s a neat trick. First off, don’t use password managers that store your data in proprietary formats; god forbid you someday need to change platforms or store something more than just “passwords”. Use a simple plain text file and pick Markdown to organize the data. Here’s how mine looks:

Credentials File

Now, use a little-known facility in OSX to create encrypted disks. Type ⌘+space & open Disk Utility. Then, File > New > Blank Disk Image. Under Encryption, choose 256-bit AES (why not?) and for Image Format, choose sparse disk image. Choose a place to save it and you’ve got your own secure storage.
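If you prefer the Terminal, the same disk image can be created with `hdiutil` (the size, volume name and paths below are just examples; you’ll be prompted for a passphrase):

```shell
# Create a 100 MB AES-256-encrypted sparse image
hdiutil create -size 100m -type SPARSE -fs HFS+ \
    -encryption AES-256 -volname Credentials \
    ~/Documents/credentials.sparseimage

# Mount it, copy the file in, then eject
hdiutil attach ~/Documents/credentials.sparseimage
cp ~/Credentials.md /Volumes/Credentials/
hdiutil detach /Volumes/Credentials
```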

Encrypted Disk Image

Drag and drop your Credentials markdown file into your disk image and you’ve got your own secure password store. If you’ve got backup tools like Time Machine or Backblaze installed, they’ll pick up this file too, so your passwords are backed up. Have fun!

Here’s a small nod to Backblaze. They’ve been managing my offsite backup needs for over 2 years now with no complaints. It works much like Dropbox, which is probably the biggest compliment I can give them.

Shell Apps and Silver Bullets

Really great article from Ben Sandofsky that mirrors some of my recent thinking around Phonegap:

The difference is shell apps come from the wrong mentality. They start from, “How do we reduce effort?” instead of “How do we deliver the best product?”

The Adobe Turnaround

Three years ago, Adobe was a desktop publishing company heavily invested in proprietary tools. It had great desktop image publishing software with Photoshop, top-of-the-line rich content creation for the web with Flash and good and accessible document sharing for the desktop with Adobe Acrobat.

And then Steve Jobs decided to wage a war on Flash. And in a testament to how fast things can change in the IT world, Adobe suddenly looked to be in trouble: an aging software company that didn’t have a cloud solution, wasn’t doing anything for mobile media creation and didn’t have any out-of-the-box solutions for creating mobile apps. Everybody else seemed to be moving data to the cloud, writing mobile applications that were far more interactive than anything previously available and moving on from static web content on the desktop to richer desktop-like web applications.

For a while, Adobe seemed to flounder. Like any company faced with the innovator’s dilemma, it tried to double down on its roots and extend its older software in ways it had never been written to do. Around three years after Adobe promised to port Flash to mobile devices, it finally had a version that worked “well enough”. But after riding the “open” bandwagon for quite a while (with partners who seemed to support it half-heartedly, no less), somebody at Adobe finally took a long, hard look at the Steve Jobs letter that summarizes everything that’s wrong with Flash (and with Adobe).

And the new Adobe is awesome. It’s a company that has woken up to what it does best: create great tools for web (and mobile) developers. Dreamweaver 5.5 significantly improves HTML5 support, works with mobile browsers and supports jQuery out-of-the-box. Adobe picked up some nifty technologies along the way too: it acquired Nitobi, makers of the top-notch Phonegap application, giving it a great foothold in mobile webapp creation (and in a stroke of genius, it then promptly submitted Phonegap to the Apache foundation, ensuring proper stewardship and great continuity). The acquisition of Typekit similarly gave it a commanding position in the web type-foundry world.

The turnaround point probably came with the announcement to ditch Flash for mobile devices. That took serious balls and it was from an Adobe coming to the realization that it had to give up the Flash empire. But John Nack puts it really well here:

“When the oak is felled the whole forest echoes with its fall, but a hundred acorns are sown in silence by an unnoticed breeze.”

And those acorns are starting to sprout. Adobe has great new HTML5 development tools in the works, a growing community around web standards, and software that’s looking great on the mobile, desktop and on the web.

But most importantly, perhaps, with today’s Creative Cloud offering, Adobe has shrugged off its desktop-software roots. A more accessible subscription model means that customers used to paying less than a buck for an app will find Adobe’s pricing much more palatable. And it’s working to innovate on the cloud as well, offering a full suite of creation, storage, sharing and publishing tools.

Adobe is poised to do great stuff. Happy for them!

The Holstee Manifesto, Up On My Wall

I took one look at the Holstee Manifesto and decided I wanted it right then & there.

Here’s what it says:

This is your Life.

Do what you love, and do it often. If you don’t like something, change it. If you don’t like your job, quit. If you don’t have enough time, stop watching TV. If you are looking for the love of your life, stop; they will be waiting for you when you start doing the things you love.

Stop over analyzing, life is simple. All emotions are beautiful. When you eat, appreciate every last bite. Open your mind, arms, and heart to new things and people, we are united in our differences. Ask the next person you see what their passion is, and share your inspiring dream with them. Travel often; getting lost will help you find yourself.

Some opportunities only come once, seize them. Life is about the people you meet, and the things you create with them so go out and start creating.

Life is short. Live your dream and share your passion.

And it says this in beautiful old school typographic print. I ordered one, had it framed in classic black and it now hangs on my wall to be the first thing I see when I wake up.

Remove ugly header from your WordPress.com page

You’ve probably seen the ugly dark banner on top of your WordPress.com blog when you’re logged in (and sadly, when others are logged in too). If you’re willing to pay for the Custom Design upgrade, here’s how to remove it.

Insert this in Appearance » Custom Design » CSS.

#wpadminbar {
	display:none;
}

Ergonomic Analysis of Doors Connecting Malet Place Engineering Building to the Roberts Building

(This is unmarked coursework, part of my HCI course at UCLIC and is released with permission from Prof. Rachel Benedyk. Since this is not evaluated and hasn’t gone through any sort of peer review process, it will most certainly contain errors.)

This analysis uses static anthropometric data to identify inconsistencies between the door measurements and the stature of the intended population, and between the actual and recommended positions of door artefacts such as push handles and see-through windows. It further uses dynamic anthropometry to find out whether the restoring torque of the door is within recommended limits. The most common use case, a healthy adult, is considered in detail as the target user group, and the lack of wheelchair accessibility is also noted.

Door Dimensions

Figure 1: Approximate Measurement of doors

Static anthropometry does not reveal faults with the door width and height. According to the available data, the width of the door at 93 cm (all measurements are described in detail in Figure 1) and its height at 240 cm are adequate for the 95th percentile man who, at 180 cm tall (Pheasant 2003), serves as the limiting user. The recommendations for a door handle or push plate state that it should be 25-35 cm from the door edge and 100-150 cm above the floor (Chang & Drury 2007), and we see that both the push plate and the door handle, at 8 cm from the door edge, are too close to it. However, both have the requisite vertical span: they are long vertical strips 65 cm in length, giving them a total effective reach of close to 145 cm, well within the recommended range. Analysing the placement of the see-through door windows reveals an immediate flaw: for a short woman as a limiting user (shorter than 140 cm), the window would be useless, since it would not allow her to see an intended user on the other side. The major limitation of the static anthropometry method was that it did not consider the purpose of the door: it hypothesised a theoretical user and did not analyse the function of the door, which is to open, close and lead the user through.

We decided then to take a step back and use the doors ourselves and note down psychophysical observations. We also observed other people using the doors and noted down possible flaws in the door design. One problem was immediately obvious and unanimous: the doors were too heavy and couldn’t be pushed through easily. We noted an instance where a man walked in with a package, couldn’t use sufficient leverage to open the door with a single hand and instead kicked out with his leg to stabilise the door enough to move through. These observations made it evident that the heaviness of the door was a source of major ergonomic discomfort.

Taking into account Chang & Drury’s recommendations for restoring torque at 30 Nm, an analysis could be done after measuring the door width and the placement of the handle. However, the study was complicated by three factors: 1) the hinges on the door were badly in need of oiling, 2) the door had different resistances at different points in its axial opening and 3) the hinge was loose and at least a portion of the door was intersecting with the frame of the door causing added initial opening friction. Even ignoring these three factors, and calculating the torque of this door,

Moment = F × r × sin(a)

Assuming a is 90 degrees, i.e. the user pushes the door at a right angle, this reduces to

Moment = F × r

Since there was no way to measure the force required to open the door, we assume a force ranging from 22 to 132 N, as in Chang & Drury. With the push point 85 cm from the hinge, the moment at 22 N would then be 22 N × 0.85 m = 18.7 Nm. Assuming the upper limit of a 132 N force, the moment would be 112.2 Nm.
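The arithmetic above can be checked with a short script. This is just a sketch of the same calculation: the function name is invented, the 85 cm lever arm is the measurement from the text, and the 30 Nm limit is the Chang & Drury recommendation cited above.

```python
import math

def door_moment(force_newtons, lever_arm_m, angle_deg=90.0):
    """Moment = F * r * sin(a); a = 90 degrees when pushing at a right angle."""
    return force_newtons * lever_arm_m * math.sin(math.radians(angle_deg))

LEVER_ARM = 0.85          # 85 cm from hinge to push point, as measured
RECOMMENDED_LIMIT = 30.0  # Nm, per Chang & Drury (2007)

for force in (22, 132):
    moment = door_moment(force, LEVER_ARM)
    verdict = "exceeds limit" if moment > RECOMMENDED_LIMIT else "within limit"
    print(f"{force} N -> {moment:.1f} Nm ({verdict})")
# 22 N -> 18.7 Nm (within limit)
# 132 N -> 112.2 Nm (exceeds limit)
```

Solving 30 Nm = F × 0.85 m also gives the roughly 35 N threshold mentioned below.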

We notice here that if the force on the door exceeds roughly 35 N, we exceed the stated recommendation. The three complicating factors noted above, however, suggest resistance well beyond the apparent weight of the door. The hinges of the door at the Malet Place end were so stiff that the door would not open beyond 70 degrees unless an exceedingly strong force was used. While any door interrupts the dynamic flow of an individual’s walking, successful doors try to stay out of the way by minimising resistance and being easy to open and close. The resistance of these doors makes even the strongest user take a cognitive break from his actions and consciously force the door to move through; this is especially evident on a user’s first encounter, since he does not expect the door mechanism to be so rusty.

Considering the operation of the door with normal users in mind, the biggest recommendation that can be made to ease its use is to oil the hinges and position the door within its frame so that no part of the door impinges on the frame. The door could also be made lighter and conceivably transparent, since it does not overlook any sensitive areas. To ease wheelchair access, the door could be powered. The push plates and the handle could also be repositioned to conform to the recommendations.

The doors would also probably require regular maintenance, since they experience heavy traffic throughout the day. Because they connect two buildings with possibly separate and insulated heating systems, a heavier door might have been preferred, but the ergonomic costs add up with each use. Keeping cost requirements in mind, an enclosed area between the two buildings could be constructed to serve the same purpose, since the primary function of the door in this instance seems to be insulation.

While considering an evaluation of this nature, one thing I thought could be done differently was to perform a contextual enquiry of people using the door, immediately after they went through the double doors. The users might be able to articulate what their material difficulty was and provide clues to how these doors could be constructed better. The study could also have been improved with more data to analyse: it would be relatively easy to set up video recording equipment, observe users interacting with the doors and analyse quantitative measures as well, such as the time it takes a user to successfully complete the interaction. Grouped by gender, perhaps, this could provide further insights.

(1047 words).

References:

Pheasant, Stephen (2003). Bodyspace. Anthropometry, Ergonomics and the Design of Work. Second Edition. p. 244.

Chang, Shih-Kai, Drury, Colin G. (2007) Task demands and human capabilities in door use. Applied Ergonomics, Vol. 38, pp. 325-355.

Heuristics as an Aid to Training a Usability Evaluator’s Expertise

(This is unmarked coursework, part of my HCI course at UCLIC and is released with permission from Prof. Ann Blandford. Since this is not evaluated and hasn’t gone through any sort of peer review process, it will most certainly contain errors.)

Heuristics as an Aid to Training a Usability Evaluator’s Expertise

Vishnu Gopal

University College London

It is clear that heuristic evaluation as Nielsen envisioned it is a method meant for experts (Nielsen, 1992). Heuristics do not stand alone and have to be moulded to fit any particular scenario: the general set of heuristics has been expanded into specific guidelines for different kinds of activities, such as accessibility and internationalization (González, Granollers, Pascual, 2008), and these require evaluators who are trained and experienced in separate spheres. Experimental data also seems to suggest that the more experienced the evaluators, the more usability errors they find (Dumas and Redish, 1999). Given this scenario, I seek to explore whether heuristics or other similar guidelines can serve as a tool to strengthen a beginning evaluator’s “experience”.

Without doubt, applying heuristics to usability evaluation gives a methodical structure to the task of analysing potential faults in a system. During a recent analysis of e-commerce websites, one major observation I made was that without rules, it is easy to miss the forest for the trees: one might speculate on possible faults (e.g. the links in the right-hand navigation bar were not prominent or relevant enough, or the visual design was not attractive) but fail to gather the data into meaningful, coherent suggestions. The aim, after all, of a successful usability evaluation is to find ways to rectify potential faults. When pitted against an evaluator’s raw instincts, then, following a set of guidelines acts both as a reasonably exhaustive search space and as a framework for assessing the faults that have been found. It can also be argued that using a set of guidelines methodically can sensitise an evaluator to common errors.

On the other hand, I also observed that some faults are found more easily not through strict adherence to guidelines but through an evaluator’s own prior experience. During the usability evaluation activity, my companion, a trained visual designer, found that much of the website’s apparent clutter was due to it not following a coherent “grid system” (see Chang, Dooley, Tuovinen, 2002). While it might be easy to slot this into either guideline 4 (consistency) or 8 (aesthetic design), it does not cleanly fit into the heuristic framework provided, yet it is a crucial criticism nevertheless. I suspect that strict adherence to guidelines without a broader background might harm rather than help a beginning evaluator’s progress, but this requires detailed investigation. It could also be too easy to be trained to look for a series of specific and common problems rather than to evaluate a system based on its intent.

The approach is further imperilled by the fact that the minutiae of specific guidelines change often. An example is how the specific recommendation about websites displaying content above the fold (the initial viewable area) changed between 1994 and 1997, a short span of three years (Nielsen 1997).

When compared with other methods of user testing, heuristics pale further in this regard. They remove a vital component from usability evaluation: the serendipity (Stoskopf 2008) that observing a user adds to the training of an evaluator’s instincts. This is especially important in a field like usability evaluation, where observation of real users continues to be stressed (Petrelli, Hansen et al. 2004), and rightly so, for HCI evaluation has its roots in cognitive psychology, a science yet to attain adulthood (Miller 2003).

It would also be instructive to observe how HCI (and accordingly usability evaluation) is taught in University courses worldwide. Saul Greenberg of the University of Calgary remarks that a “fundamental tenant of HCI is that end-users should play an integral role in the design process” and that “performing usability studies in class hammers home the relevance of evaluation”—indeed his course description (Greenberg 1996) is filled with references that directly involve users in class. Interestingly, the course is structured so that “Designing Without the User” is a later event: where lessons learnt from these evaluations are then integrated to try to formulate a theory of user behaviour.

Chan et al., exploring issues in integrating HCI into masters-level MIS programs, also stress the emphasis on users and “empirical testing” and recommend a curriculum that largely ignores heuristics. Faulkner and Culwin, in “Integrating HCI and Software Engineering”, condense it well and also explain a crucial difference with software engineering:
“Some HCI practitioners seem to believe that if HCI can be reduced to guides and checklists that anyone can apply to anything, then all will be well. This is tantamount to designing HCI out of software engineering as it is providing rules to be followed without the requisite theoretical under-pinning. Students trained in this way will be chanting mantras and will be woefully unable to deal with problems that have not been solved elsewhere or are not covered by style guides and checklists. Software engineers on the other hand are either keen to embrace these checklists or are unwilling to accept that the age of users having to adapt themselves to systems has gone. Users want systems to work for them and not the other way round.” (Faulkner & Culwin, 2000)

Furthermore, in a study examining how guidelines and patterns might be effective in HCI teaching, Hvannberg et al. “found very little hard evidence” supporting the importance of using patterns or guidelines in HCI teaching. However, they also noted “a desperate need to conduct studies on a suitable scale on the use” of patterns and guidelines in teaching HCI concepts.

There is no doubt that Nielsen’s basic heuristics have stood the test of time as a way to find usability errors. However, as a tool to train a beginning evaluator, they should certainly be supplemented by other evaluation methods.
(1045 words).

References:

Chan, S.S, Wolfe, R. J., Fang, X. (2003), Issues and strategies for integrating HCI in masters level MIS and e-commerce programs, International Journal of Human-Computer Studies, Volume 59, Issue 4, Zhang and Dillon Special Issue on HCI and MIS, October 2003, Pages 497-520, ISSN 1071-5819, DOI: 10.1016/S1071-5819(03)00110-1.(http://www.sciencedirect.com/science/article/B6WGR-4938JRM-1/2/498d2855a23d35c7524dd9c4201b5d4e)

Chang, D., Dooley, L. Tuovinen, L.E. (2002), Gestalt theory in visual screen design: a new look at an old subject, ACM International Conference Proceeding Series; Vol. 26 Proceedings of the Seventh world conference on computers in education conference on Computers in education: Australian topics – Volume 8

Dumas, J.S., Redish J. A Practical Guide to Usability Testing. (1999), Oregon. Intellect Books. pp 67.

Faulkner, X. Culwin F. (2000), Enter the Usability Engineer: Integrating HCI and Software Engineering, ACM SIGCSE Bulletin

González, M.P, Granollers, T., Pascual, A. (2008), Testing Website Usability in Spanish-Speaking Academia through Heuristic Evaluation and Cognitive Walkthroughs, Journal of Universal Computer Science, vol. 14, no. 9.

Greenberg, S (1996), Teaching Human Computer Interaction to Programmers. Technical Report 96/582/02. University of Calgary.

Hvannberg, E.T., Read, J.C., Bannon, L. Kotzé, P. & Wong W. (2006), Patterns, anti-patterns and guidelines: Effective aids to teaching HCI principles? In, Inventivity: Teaching theory, design and innovation in HCI – Proceedings of HCIEd2006-1 (First Joint BCS/IFIP WG 13.1/ICS /EU CONVIVIO HCI Educators Workshop (pp. 115–120). Limerick, University of Limerick.

Miller, G. A. (2003), The cognitive revolution: a historical perspective. Trends in Cognitive Sciences, vol.7 no.3.

Nielsen, J. (1992), Finding usability problems through heuristic evaluation. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (Monterey, California, United States, May 03 – 07, 1992). P. Bauersfeld, J. Bennett, and G. Lynch, Eds. CHI ‘92. ACM, New York, NY, 373-380. DOI= http://doi.acm.org/10.1145/142750.142834

Nielsen, J. (1997), Scrolling Now Allowed. Blog post at http://www.useit.com/alertbox/9712a.html

Petrelli, D., Hansen, P., Beaulieu, M., Sanderson, M., Demetriou, G. and Herring, P. (2004), Observing Users – Designing clarity a case study on the user-centred design of a cross-language information retrieval system. Journal of the American Society for Information Science and Technology, 55 (10). pp. 923-934.

Stoskopf, M. K. (2008), How Serendipity Provides the Building Blocks of Scientific Discovery. ILAR Journal, vol. 46.
