Movie Review: The Hundred-Foot Journey

Tonight we saw the film adaptation of The Hundred-Foot Journey, based on the novel of the same name. Several weeks ago I heard a brief description of it and thought it sounded agonizingly boring. Last week I read a longer description on the theater website, and the Wikipedia article about it, but neither made me think I would enjoy it. I finally saw the trailer preview and thought it looked potentially better than I was imagining, and, being in the mood for some popcorn and a movie, we decided to see what it was like.

I’m really not sure what they could have done to promote it better, but this was one of the most interesting and well-produced films I have seen in a long time. Summarizing the plot here seems pointless, since my own reading of plot summaries did not make me want to see the film. So I’ll simply say that it is worth seeing, much more than one might suspect.

The story was fresh yet familiar; the characters were believable, and the acting neither too eccentric nor too mundane; the cinematography was first-rate; and the score masterfully underlined each scene without drawing attention to itself.

You might be able to still see it in a theater for a few more days, but otherwise it will be available on DVD/Blu-ray in early December.

How do people discover new books?

Today’s New York Times has an article by economist Paul Krugman about Amazon’s alleged status as a monopsony in the book-selling world. (I remember Krugman from college macroeconomics, but had to look up the definition of monopsony!)

There are opposing views in the saga between Amazon and publishers like Hachette, but even assuming that Amazon’s actions have been entirely proper, one aspect of the story in particular interested me. From the article:

Book sales depend crucially on buzz and word of mouth (which is why authors are often sent on grueling book tours); you buy a book because you’ve heard about it, because other people are reading it, because it’s a topic of conversation, because it’s made the best-seller list. And what Amazon possesses is the power to kill the buzz. It’s definitely possible, with some extra effort, to buy a book you’ve heard about even if Amazon doesn’t carry it — but if Amazon doesn’t carry that book, you’re much less likely to hear about it in the first place.

I can’t recall ever failing to find a book on Amazon that I was deliberately looking for, though I have no idea what books I never discovered at all simply because Amazon didn’t carry them. Amazon may be unlikely to break ties with larger publishers, but tiny publishers put out excellent books too, and might be more likely to end up unfavored by the online retail giant.

With that in mind, are there any other book discovery habits besides searching on Amazon that we can cultivate?

  • Identifying favorite publishers helps in finding good books. I personally have a lot of books published by the MIT Press. Their books have consistently impressed me as well-written, well-edited, nicely-typeset, and nicely-bound on high-quality paper. Even when I buy one of their books on Amazon, I often find it first on the publisher’s website. Books from Sher Music are also consistently excellent.
  • While reading books and journal articles and magazine articles, I often make note of books that are referenced therein, and look those up as well. I keep several lists of books (currently numbering in the hundreds) that I tentatively want to acquire in the future.
  • There are some internet services that help people discover new books. We don’t necessarily know where these services get their book listings, so it still might come back around to whatever Amazon carries… but maybe not.

Society discovered and shared and bought and read new books for a very long time before Amazon became the de facto gateway to book shopping. If we deliberately choose to discover books through means other than searching Amazon, we certainly can do so, but I suspect that for many readers, Krugman will prove very correct. And to allow Amazon to wield such power seems kind of disconcerting.

Technology Obsolescence

A few weeks ago I decided to brush up on my iOS development skills and start planning a new iPhone application project. The latest version of Xcode required the latest version of OS X, neither of which I had installed, because my ancient copies of Avid Pro Tools 10 and Sibelius 6 were rumored to not be fully compatible with the newest OS X. Additional research suggested that, with the most recent minor updates on OS X, any incompatibilities would be non-issues for me, so I went ahead and upgraded to OS X Mavericks and installed Xcode 6.

Apparently just in time for Apple to release OS X Yosemite, which will surely come with its own set of potential incompatibilities with older software packages. Hopefully nothing will compel me to upgrade to Yosemite anytime soon…

While perusing the other latest announcements from Apple, I thought maybe setting up a Mac Mini as a dedicated music production computer could be a good idea: about as much power as my laptop, and I could do software development on the laptop instead, without concern of new OS updates from Apple wreaking havoc on the music production software. But wait! What happened to the FireWire 400/800 connection on the Mac Mini?

I guess I’ve been sleeping through the recent technical specification changes from Apple, as none of their current computer offerings come with built-in FireWire connections. Hopefully their FireWire to Thunderbolt adapter actually works, or I might have to replace my FireWire audio interface whenever I eventually get a new computer.

But why should I get a new computer? There was a time when it made good sense to replace your computer every two or three years, because over that time period, processor chips became significantly faster, and both memory and hard drive space became less expensive and more abundant. The basic computing specifications for today’s MacBook Pro are nearly identical to those of mine from four years ago. My computer is, essentially, still up-to-date. It’s mainly the software that is still constantly changing, with operating system updates driving updates in application software that you don’t necessarily want to upgrade just yet.

Technology obsolescence has been going on ever since technology has existed. I wouldn’t expect to still be using the video genlock I had on the Amiga 3000, nor the MIDI I/O card from the Apple IIe, nor the Borland C++ compiler that I used on Windows NT 4. But sometimes it appears that technology which is still perfectly up-to-date is being made arbitrarily obsolete, and that you don’t have to use your computer for much besides browsing the web and listening to music to experience it…

2014 Honda CR-V

After seven years and 92,000 miles we traded in the 2006 Honda CR-V for a 2014 model. What has seven years of design revisions given us? The most notable changes: a rear-view camera and corresponding dashboard display, automatically activated when you shift the car into reverse; a switch for a better-fuel-economy mode (which diverts resources from other areas of the car and makes acceleration more sluggish, so it actually does make sense to turn it off at times); and Bluetooth connectivity for your mobile phones / tablets / whatever, offering the ability to place phone calls through the car interface and play music from your portable device.

Since you can so easily play your portable MP3s through the car sound system, the six-disc CD changer from 2006 has been scaled back to a single-disc player. Unsurprisingly, the cassette player has disappeared entirely. I first bought an Apple iPod about ten years ago, and started using an iPhone six years ago, but have still listened mostly to CDs. After two days of driving around with the iPhone-to-car-stereo interface, I can readily imagine that Honda will ditch the CD player altogether in a couple of years, and I for one won’t object.

A myriad of other smaller updates also make the car generally nicer: passenger doors that open wider; back seats that fold down automatically at the pulling of a lever; extended field of view on the driver-side mirror; a little bit more interior space, both for passengers and cargo.

The owner’s manual (p238) offers some good advice:

If you get stuck, carefully go in the direction that you think will get you unstuck.

Sound words for driving, and perhaps for life in general. And in case the temperature warnings on cups of McDonald’s coffee aren’t enough, the manual (p134) also advises us:

Be careful when you are using the beverage holders. Hot liquid can scald you.

But most importantly, how does Samantha the border collie enjoy the 2014 CR-V? Seemingly just about as much as she enjoys every other vehicle she has been in, which is to say, not at all. After sitting in the passenger seat for a few minutes, she much preferred going for a walk over going for a drive!

More: pictures on Flickr

PHP-FPM failures with Nginx

Following the news about a security flaw in GNU bash, I made sure to update everything on my web server. I had been using Nginx for serving web pages for over a year now, moving to it after many years of using Apache. After doing the various software updates on the server system, I began noticing web pages responding unusually slowly, and then eventually failing altogether, with a “502 Bad Gateway” error.

I traced the problem roughly to something odd with PHP-FPM, finding log entries complaining about an insufficient allowance for child processes, followed by the service terminating. I had never seen anything like this before, and combed through related Google search results to try to find a solution. I tried several things that appeared to make sense, but the situation did not improve.
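For anyone chasing similar log messages: the child-process limits involved live in the PHP-FPM pool configuration (commonly `/etc/php-fpm.d/www.conf` or `/etc/php5/fpm/pool.d/www.conf`, depending on the distribution). A sketch of the relevant settings, with illustrative values rather than the ones from my server:

```ini
; PHP-FPM pool process-manager settings (illustrative values only)
pm = dynamic              ; spawn and reap workers on demand
pm.max_children = 20      ; hard cap on simultaneous worker processes
pm.start_servers = 4      ; workers created at startup
pm.min_spare_servers = 2  ; below this many idle workers, FPM spawns more
pm.max_spare_servers = 6  ; above this many idle workers, FPM kills some
```

When every worker is busy, incoming requests queue up (slow pages) and can eventually surface as the 502 error from the web server in front; raising these limits is the usual first suggestion, though in my case fiddling with them did not resolve the underlying problem.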

Finally I reasoned that I had never seen this problem with Apache, yet plenty of people were reporting it with Nginx. It did not appear to be a problem with Nginx itself, but rather with some interaction between PHP-FPM and Nginx. Since I really wasn’t doing anything Nginx-specific to serve a few web pages, I just switched back to Apache. So far, all seems well again.

Writing software is hard. Writing software that always interacts correctly with a bunch of other software is even harder. I strongly suspect there is still something I could do, whether editing configuration files, modifying source code, or upgrading or reverting some piece of the puzzle, that would let me continue using Nginx. But I have other things to do besides fiddle with web servers for a few days. Sometimes the easiest solution for a user is to just swap out one software component for another.

Designing Xerox Star Software in 400 Pages

I’ve been reading Bringing Design to Software, an ancient tome (published in 1996) that collects interviews with a dozen software practitioners on the subject of software design. In modern parlance, what was called “software design” in 1996 might overlap with what today is called “user experience”; in any event, it is an activity related to, but separate from, programming, one that results in a well-planned specification for what is to be programmed.

The first interview is with David Liddle, who worked on the Xerox Star, an early desktop computer system aimed at business productivity use. How did Liddle and his colleagues go about designing the Star software?

We ended up writing a 400-page functional specification before we ever wrote one line of code. It was long, because it included a screen view of every possible screen that a user would see. But we did not just sit down and write it. We prototyped a little bit, did some testing with users to decide what made sense, and then wrote a spec for that aspect. Then we prototyped a bit more, tested it, and then spec’d it again, over and over until the process was done.

400 pages of software requirements may be commonplace in specialized applications like avionics systems, but it’s far more planning than most software gets today. Not content with that, Liddle’s team also hired Bill Verplank, a human-computer interface expert from MIT:

Verplank and his crew did 600 or 700 hours of video, looking at every single feature and function. From all these video recordings, we were able to identify and eliminate many problems. For example, we chose a two-button mouse because, in testing, we found that users demonstrated lower error rates, shorter learning times, and less confusion than when they used either one-button or three-button mice.

Being on the front line of developing early office applications, Liddle also addresses the misconception that the software models of files and folders and desktops were meant to copy a real-world office environment:

It is a mistake to think that either spreadsheets or desktops were intended to imitate accounting pads, office furniture, or other physical objects. The critically important role of these metaphors was as abstractions that users could then relate to their jobs. The purpose of computer metaphors, in general, and particularly of graphical or icon-oriented ones, is to let people use recognition rather than recall. People are good at recognition, but tend to be poor at recall. People can see objects and operations on the screen, and can manage them quite well. But when you ask people to remember what string to type to perform some task, you are counting on one of their weakest abilities.

Curiously, a lot of software written for programmers to use puts heavy demands on recalling arbitrary strings of text…

Would it still make sense to write a 400-page specification for office application software today? Would it still make sense to record hundreds of hours of video to research the optimal way to use the software? Maybe not. Thirty-three years have passed since the Xerox Star, and along the way, many good software design concepts have been identified and established as common practice. If you’re building software for an Apple desktop or mobile platform, for example, you can simply follow Apple’s design guidelines and save yourself a great deal of fundamental human-computer interaction research.

Nevertheless, spending time to plan your application up front may still be a good idea. Thinking through the interaction experience and the needed functionality with a pad of paper and a pen can make writing the code more straightforward, and software is easier to test if you have a precise definition of what it’s supposed to do.

Thanks to people like David Liddle, we can draw on years of experience in good software design to get a head start on our own projects!

Canon 300mm/4 Lens from LensRentals

I needed a camera lens longer than anything I presently own for some event photography, and decided to rent a Canon 300/4 from LensRentals.

One frequently-cited advantage to using either Canon or Nikon SLR cameras is that they are so common that you can easily rent whatever you need that you don’t own. I’ve never been able to enjoy that alleged truth locally, as I’m not aware of any shop that rents camera equipment within a reasonable driving distance from home. But over the past few years, renting camera equipment online has taken off in popularity, with LensRentals being one of the more well-known providers.

I created my account on a Wednesday afternoon and placed an order to receive the lens on the following Monday and return it the Monday after that. On the shipment form I asked them to send the lens to my customary third-party shipment service provider, who is authorized to sign for packages on my behalf. LensRentals contacted me and asked if they could instead ship directly to a FedEx location, explaining that once the package leaves the hands of FedEx, regardless of who signed for it, I become personally liable for whatever happens to it; so they prefer that it go directly from FedEx to me.

The package was delivered on schedule to a nearby FedEx location. The lens came lightly wrapped in bubble-wrap, placed inside its standard-issue Canon carrying case, with that placed inside a close-fitting block of foam rubber. The lens was very clean; not mint and pristine, but showing only very light signs of use. I would have been thrilled to buy a used lens in such good condition.

Over the next several days I completed my event photography assignment, which concluded on a Friday evening. Per the LensRentals web page, the next day that I could ship the lens back was Monday (no weekend shipments), so over the weekend I enjoyed taking some personal pictures of Samantha the border collie and a full moon.

Monday morning I packed the lens back up the way it came, slapped the prepaid FedEx return label onto the box, and dropped it off at the same FedEx location where I had picked it up. The entire LensRentals experience went flawlessly, and I look forward to using their services again when I have a short-term need for camera equipment.

Oh, and the lens itself? I don’t have much to add to the scores of reviews already online, but in short, it’s a great lens. My main usage was a moderately well-lit indoor environment. The f/4 aperture was plenty to freeze motion at a shutter speed around 1/160s, and the image stabilization made that slow a shutter speed quite usable hand-held. Outside, at 300mm I was able to stay far enough away from Samantha that she usually didn’t find the camera objectionable.

The resulting images were clear and as richly contrasty as I have come to expect from Canon prime lenses. If the focal length and aperture match what you need, and you have an extra $1500, buy one today. If you’d rather spend $1400 less than that, and don’t need 300mm all the time, then rent one for a week from LensRentals.

Three FireWire Audio Interfaces in Six Months

In late 2012, I started using an Echo AudioFire 12 to route analog audio into my 2010 Apple MacBook Pro. On paper, the AudioFire 12 was exactly what I wanted: 12 analog inputs and 12 analog outputs converted to digital and sent across FireWire. I’m using outboard preamps, so I don’t particularly need the A/D interface to offer built-in preamps. The AudioFire 12 didn’t offer anything fancy; it was an A/D interface with FireWire output.

Initially, all seemed well. Very occasionally when recording I got a spurious digital pop noise on a track. I thought I was probably overdriving something somewhere, and investigated possible causes as a background task.

After a few months, the popping noises increased. About a year after acquiring the AudioFire 12, I was using it for a series of recording sessions, some of which turned out flawless, while others were littered with pops, in some places so dense that they came across as crackling.

I learned that these noises are called gaps: essentially, a brief lapse in successful transmission of audio data. In the recorded wave file, when you zoom in far enough, you can see that a normal audio wave is a generally smooth, continuous line. A gap introduces a sudden discontinuity; the wave line jumps from one point to another. Once you locate a gap in the wave file, you can manually fix it by repairing the wave line, making it smooth and continuous again. You can also use a number of software tools to repair gaps automatically.
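Since a gap shows up as a sudden jump between adjacent samples, it can also be located programmatically rather than by zooming around in an editor. A minimal sketch in Python with NumPy (the threshold is an assumed value you would tune to the material, not part of any particular repair tool):

```python
import numpy as np

def find_gaps(samples, threshold=0.5):
    """Return sample indices where the waveform jumps discontinuously.

    samples: 1-D array of floats in roughly [-1.0, 1.0]
    threshold: minimum sample-to-sample jump treated as a gap
               (an assumed value; tune it for your recordings)
    """
    diffs = np.abs(np.diff(samples))
    return np.flatnonzero(diffs > threshold) + 1

# A smooth sine wave with one artificial discontinuity at sample 500
t = np.linspace(0, 1, 1000)
wave = 0.8 * np.sin(2 * np.pi * 5 * t)
wave[500:] += 0.9  # simulate a dropped chunk of audio data

print(find_gaps(wave))  # reports the neighborhood of sample 500
```

A real repair tool would then interpolate across each reported index to restore a smooth line; this sketch only does the locating step.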

So I was able to salvage my recording session data, but it was obvious that sloshing around with frequent gaps in recorded audio wasn’t an appealing path forward. Research on the web suggested a number of configuration changes to try, but when all of those failed, I concluded that the problem was almost certainly an incompatibility between the FireWire chipset in the AudioFire 12 and the FireWire chipset in my computer. Had it been a full desktop computer instead of a laptop, I could have installed a secondary FireWire interface card that might have been compatible, but that wasn’t an option for me.

MOTU 828

I decided to sell the AudioFire 12 and replace it with another interface. I had had good success in the past with an audio interface from MOTU (Mark of the Unicorn, based in lovely Cambridge, Massachusetts), and, not needing anything fancy, I bought an original MOTU 828 unit, reportedly the very first FireWire-based audio interface, with eight channels of analog inputs, only two of which sported built-in preamps.

The MOTU 828 was something on the order of 12 years old, but it worked perfectly. Its minimal audio routing and monitoring capabilities felt a little archaic, and it lacked the convenient MIDI I/O tacked onto nearly every modern FireWire audio interface, but I was able to make do. It chugged away in service of my home recording studio for an astonishing four months before its internal FireWire chipset flaked out: connectivity with the computer failed, and it started emitting a high-pitched squeal that gave me a moderate headache as I tried to make it stop.

I read online that MOTU tech support could, as of 2011 anyway, replace the FireWire chip and refurbish an 828 unit for $75, plus shipping. I called MOTU the next morning only to learn that they no longer service the ancient original 828.

So I needed to buy another new interface. I ended up selecting a new interface from Focusrite.

Focusrite Saffire Pro 40

The Echo AudioFire 12 is fairly unusual: audio recording professionals who favor outboard preamps typically buy $2000 interface units that have no built-in preamps. I wasn’t looking to spend $2000 on an interface, but, apart from the Echo equipment, I’m not aware of another sub-$1000 interface that isn’t cluttered with its own preamps.

Built-in preamps are not necessarily bad, but they almost certainly will not be as good as standalone preamps. I have a Focusrite ISA two-channel preamp that sells for $900, or $450 per channel. The Focusrite Saffire Pro 40 audio interface has eight built-in preamps, one on each input channel, and sells for $500, which works out to less than $60 per preamp even before accounting for the cost of the interface itself. It doesn’t seem credible that these sub-$60 preamps would be designed and built as well as a $450-per-channel preamp from the same manufacturer. This does, though, get usable preamps into the hands of people who might not have bought them otherwise.

I’ve used the Focusrite Saffire Pro for all of about ten minutes, and so far it sounds great. Or I suppose I should say, it doesn’t sound like anything in particular; it functions as an audio interface, converting between analog and digital signals. I have some extra built-in preamps should I need them, and the overall design is (not surprisingly) more modern than the old MOTU 828. It lacks a clock input, but I don’t need to synchronize it with anything else at the moment. In addition to the eight analog inputs, it also has digital inputs through ADAT, so I could plug in a really nice A/D converter and use the Saffire just to feed the digital data to the computer over FireWire. The front-panel buttons feel cheap and plasticky (the knobs and power switch feel fine); I imagine Focusrite could increase the price by $50 and use the same quality of buttons they use on the ISA preamps.

Basically, it’s a lot like many other similarly-priced interface units. I don’t find these devices extremely interesting in themselves; they are just a necessary component in recording audio. They do, though, become very interesting when they stop working correctly in one way or another. Hopefully this brand-new Focusrite Saffire will function well for years to come.

Adobe Lightroom: Toward a Film Look

Here's a digital picture of Samantha the border collie, minimally processed with iPhoto:

And now with some edits in Adobe Lightroom, toward approximating a film look:

The edits: the extreme ends of the RGB tone curve moved slightly toward the center, taking the edge off both the really dark and the really bright colors; a lower color temperature; a slight reduction in contrast (à la Fuji NPH 400); and a touch of simulated film grain.
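For what it’s worth, the tone-curve part of that edit, pulling the curve’s endpoints in toward the center, amounts to remapping pixel values so that nothing reaches pure black or pure white. A rough sketch of that single adjustment in Python (the 0.05 and 0.95 endpoints are illustrative assumptions, not the values used in the actual Lightroom edit):

```python
import numpy as np

def soften_endpoints(pixels, black=0.05, white=0.95):
    """Linearly remap [0, 1] pixel values into [black, white],
    taking the edge off the darkest and brightest tones.
    The endpoint values here are illustrative, not Lightroom's."""
    return black + pixels * (white - black)

img = np.array([0.0, 0.25, 0.5, 1.0])
print(soften_endpoints(img))  # 0 maps to 0.05, 1 maps to 0.95
```

Lightroom’s actual curve tool applies a smooth spline rather than this straight-line remap, but the effect at the extremes is the same: shadows are lifted slightly and highlights rolled off.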

Recurrent Training for Software Developers?

Software developers who have been out of school for a while and apply for new jobs routinely bemoan that they have forgotten the details of algorithms, data structures, and other computer science topics that tend to come up in interviews. Even developers settled in their current jobs could stand to be reminded of things they were taught in the past but haven't thought about in years.

Pilots undergo intense training before getting certified to fly in the first place, but then also must undergo recurrent training on a regular basis.

Instead of checking off having learned algorithm design in college and never thinking about it again, would it be useful for software developers to engage in regular recurrent training to refresh themselves on things they might be letting slip from their thinking?

Taking a full semester-long class might be overkill, and too much to expect from the schedule of working professionals with families and responsibilities outside their jobs. Maybe twice a year, developers could spend a day or two being refreshed on material they already studied in detail in the past.

A format of alternating between 15-minute lectures, 15-minute in-class exercises (done by individuals on their own laptops), and 15-minute review of solutions might be a good way to go.

How could such training be set up? Large companies could do it all in-house, paying a small dedicated staff to administer the classes. Local community colleges could potentially offer this format of class to small companies and individuals in the area, as part of their alleged charter to foster continuing education.

In theory, it could also all be done online, with prerecorded lectures by especially great speakers, but spending the time to meet in person can sometimes be more motivating than watching videos.