November 17th, 2013
15 Sorting Algorithms in 6 Minutes. Be sure to turn the sound up – it's half the fun of watching how each algorithm works.
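For anyone curious what those animations correspond to in code, here's a minimal sketch of one of the algorithms typically shown, selection sort, recording each swap – the events the visualisation turns into tones. Purely illustrative; the video's own implementations are its own.

```python
def selection_sort_steps(items):
    """Selection sort, recording each swap as it happens.

    Returns the sorted list plus the list of (i, j) swap positions --
    roughly the sequence of events a sorting visualisation animates.
    """
    a = list(items)
    steps = []
    for i in range(len(a)):
        # Find the index of the smallest remaining element.
        m = min(range(i, len(a)), key=a.__getitem__)
        if m != i:
            a[i], a[m] = a[m], a[i]
            steps.append((i, m))
    return a, steps
```

Each of the fifteen algorithms in the video would produce a very different-looking (and different-sounding) sequence of steps for the same input.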
From McSweeney's, Retail Therapy: Inside the Apple Store…
When Apple employees are asked what they love most about their job (and they are asked often) most invariably answer "the people." They mean their co-workers, not the customers.
Because the daily expectations for customer service go beyond anywhere else in retail, only those with managerial ambitions will invoke their commitment to helping people. Some thrive on that. Others get diagnosed with PTSD. Consider that the flagship store on Fifth Avenue in New York City is open 24 hours and has more annual foot traffic than Yankee Stadium, yet only one door. Every day, in every Apple Store, people flood to customer service, when what many truly need is therapy.
On the face of it, a typical set of retail customer service war stories. Until the last customer's story, which is something else entirely, a reminder of how personal our modern personal computers have become.
A slice of prime early 1980s computing nostalgia, served up for British computer geeks of a certain age by The Register:
They would, Clive Sinclair claimed on 23 April 1982, revolutionise home computer storage. Significantly cheaper than the established 5.25-inch and emerging 3.5-inch floppy drives of the time – though not as capacious or as fast to serve up files – 'Uncle' Clive's new toy would "change the face of personal computing", Sinclair Research's advertising puffed.
Yet this "remarkable breakthrough at a remarkable price" would take more than 18 months to come to market. In the meantime, it would become a byword for delays and disappointment – and this in an era when almost every promised product arrived late.
Sinclair's revolutionary product was the ZX Microdrive. This is its story. [...]
It was a pity that Sinclair botched the ZX Microdrive so badly: it was a tragedy that the QL relied upon Microdrives.1 I tell you, with floppy disk drives, a decent keyboard and a finished operating system, the QL could've been a contender.
In the future, you have access to all your data. Memory, or the lack thereof, is no longer discussed. It is only assumed, a feature of modern life, since you can now relive all your past data as experiences. But because of "technical constraints," all of your experiences are taxonomized and merged for ease of efficiency/retrieval. To access your past, then, is to relive each experience – in real time, all at once.
You spend seven weeks holding your iPhone to your ear on hold.
You pull to refresh for seven months, click to refresh for nine.
You miss 30 Thanksgiving dinners restarting your laptop.
12 Valentine's Days restarting your iPhone.
You swipe past iPad ads for 48 hours before ever seeing content.
(In fairness, I should note that the copy above is at 50% of the size of the original, which serves to mask some of the rough edges. Follow the link to see the album covers in all their pixellated, colour-clashing glory.)
Nice work. It's surprising how nicely some of them turned out.
The Sensual World and 50 Words for Snow benefit from being essentially black and white images in the first place, so the dithering doesn't fall foul of the limitations of the Spectrum's graphics display,1 but some of the more colourful later albums like Aerial and Director's Cut look pretty damned fine all things considered. The run of albums from Lionheart to Hounds of Love is another matter entirely…
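For the curious, the basic technique behind images like these is dithering: each pixel is compared against a repeating threshold pattern so that a grayscale image becomes pure on/off dots whose density approximates the original tones. A minimal sketch using a 4×4 Bayer matrix (`ordered_dither` is a hypothetical helper of my own; a real Spectrum conversion would also have to respect the machine's 8×8-pixel, two-colours-per-cell attribute blocks, which is where the clash comes from):

```python
# Classic 4x4 Bayer ordered-dither threshold matrix.
BAYER4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def ordered_dither(gray, width, height):
    """Convert an 8-bit grayscale image (row-major list of 0..255
    values) to 1-bit pixels using ordered dithering."""
    out = []
    for y in range(height):
        row = []
        for x in range(width):
            # Each pixel gets a threshold from the tiled Bayer matrix.
            threshold = (BAYER4[y % 4][x % 4] + 0.5) / 16.0
            row.append(1 if gray[y * width + x] / 255.0 > threshold else 0)
        out.append(row)
    return out
```

A flat mid-gray input comes out as a checkerboard-like pattern of half-on, half-off pixels, which is exactly the texture visible in those album-cover conversions.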
One last thought: we should all be eternally grateful that the creator of these tribute images didn't accompany them with reproductions of Kate's music created using a Spectrum's sound chip.
Nine out of the eleven pictures are of scrollbars from Apple's Mac OS and iOS or Microsoft Windows, with one of the other two from NeXTstep (a.k.a. Mac OS X's eccentric uncle) and the other of the Xerox Star (a.k.a. the grandfather of every other GUI shown). No room for scrollbars from other interesting Graphical User Interfaces from the 1980s and early 1990s?1 For shame…
Digital Research GEM · Commodore Amiga Workbench · Acorn RISC OS · Palm OS · Psion EPOC32 · X Window
[Via Daring Fireball]
Jeff Atwood's post about the response of Stack Overflow's users to a request for examples of New Programming Jargon includes some real doozies.
18. Common Law Feature
A bug in the application that has existed so long that it is now part of the expected functionality, and user support is required to actually fix it.
I have to admit that back when I was teaching myself Visual Basic for Applications, my work included multiple instances of Stringly Typed functions. Even now, I struggle against the temptation to leave Ninja Comments.
All in all, it's just as well that I'm not being paid to write code.
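For readers who haven't met the jargon: a "stringly typed" function is one that shuttles everything around as strings rather than proper types, parsing and re-parsing them ad hoc. A hypothetical before-and-after sketch (both functions are my own invention, not from the Stack Overflow thread):

```python
from datetime import date

def days_between_stringly(d1, d2):
    """Stringly typed: dates go in as "YYYY-MM-DD" strings and the
    answer comes back out as a string, forcing every caller to parse."""
    y1, m1, day1 = (int(p) for p in d1.split("-"))
    y2, m2, day2 = (int(p) for p in d2.split("-"))
    return str((date(y2, m2, day2) - date(y1, m1, day1)).days)

def days_between(d1: date, d2: date) -> int:
    """The same logic with real types at the boundary."""
    return (d2 - d1).days
```

(A "ninja comment", for completeness, is a comment that is invisible because it was never written.)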
I swear this sort of thing makes me want never to set foot in an Apple Store again:
Beneath all the chillness and chirpiness [of an Apple Store] is a consumer destination whose whimsy is the result of painstaking calibration. Think Disney World's underground tunnels, except with all the draconianism out on display and integral to the whole aesthetic. The products placed on blond-wood tables at precisely measured intervals. The reservations-only appointment system at the Genius Bar. The Five Steps of Service. The fact that Jon's beard is trimmed to a uniform three inches. It takes a lot of work to stay this relaxed.
Turns out, though, that there's one more bit of precision required to make the Apple Store so Apple-y. The notebook computers displayed on the store's tabletops and counters are set out, each day, to exactly the same angle. That angle being, precisely, 70 degrees: not as rigid as a table-perpendicular 90 degrees, but open enough — and, also, closed enough — for screens' content to remain visible and inviting to would-be typers and tinkerers.
The point, explains Carmine Gallo, who is writing a book on the inside workings of the Apple Store, is to get people to touch the devices. [...]
John Dupuis reviews Nine algorithms that changed the future by John MacCormick:
John MacCormick's new book, Nine Algorithms That Changed the Future: The Ingenious Ideas That Drive Today's Computers, is very good. You should buy it and read it.
Among all the debates about whether or not absolutely everybody must without question learn to program [...] it's perhaps a good idea to pause and take a look at exactly what programs do.
Which is what this book does. It starts from the premise that people love computers and what they can do but don't have much of an idea about what goes on inside the little black box. And then, what MacCormick does is take nine general types of high-level functions that computers perform and explain first what those functions really mean and second a general idea of how software developers have approached solving the initial problems. [...]
Sounds like something I'd enjoy. The Kindle edition1 is quite expensive so I'm not going to rush and buy it now, but I'll certainly be interested in picking up a copy at a reasonable price once it shows up in paperback.
Lo, in the twilight days of the second year of the second decade of the third millennium did a great darkness descend over the wireless internet connectivity of the people of 276 Ferndale Street in the North-Central lands of Iowa. For many years, the gentlefolk of these lands basked in a wireless network overflowing with speed and ample internet, flowing like a river into their Compaq Presario. Many happy days did the people spend checking Hotmail and reading USAToday.com.
But then one gray morning did Internet Explorer 6 no longer load The Google. Refresh was clicked, again and again, but still did Internet Explorer 6 not load The Google. Perhaps The Google was broken, the people thought, but then The Yahoo too did not load. Nor did Hotmail. Nor USAToday.com. The land was thrown into panic. [...]
[Via Pop Loser]
I strongly suspect the image of an iPad strapped to the USB Typewriter is causing the late Mr Jobs to spin at somewhere in the vicinity of 200rpm even as I type this.
[Via Memex 1.1]
Despite the fact that I can count on the fingers of one hand the number of times I've had cause to design an icon over the course of the last decade, I'm sorely tempted to treat myself to a copy of The Icon Handbook by Jon Hicks just on the basis of the quantity and quality of eye candy on display therein:
This is a book that I've been wanting to write for a long time. Whenever I've looked for a book on this subject, the only available publications are reference guides that simply reproduce as many symbols as possible. Where books have gone into theory, they were published decades before desktop computers, and therefore miss the most relevant and active context of icon use. Sometimes the topic is covered as a part of a book about logo design, and amounts to little more than a page or two. So I've set out to create the manual, reference guide and coffee table book that I always desired. [...]
A sample of the book's content can be downloaded from the publisher's site if you want to see what I mean.
[For the record: I have no connection with Jon Hicks, other than having read his journal for some years now and admired his Helvetireader theme for Google Reader. All the evidence is that he knows what he's talking about.]
[...] Instead of working to make everything visible to the user, Apple's industrial and graphic designers, now fully in command, are doing just the opposite: Apparently bereft of even the barest knowledge of behavioral (HCI) design, they have busied themselves hiding everything they can, increasing visual simplicity at the expense of actual simplicity. Then, they pretend both to themselves and to us that the only instruction you'll ever need for an iPad is, "Turn it on." iPad users are left to stumble around, trying to find the things they need to get their work done, things so carefully hidden that without a friend to help them, they are unlikely to ever find them.
Case in point: At some point in the past, perhaps the distant past, Apple added the capability to jump from letter group to letter group by holding down on the letter column, rather than just stabbing at your letter of choice (and usually missing). After four years of using iDevices, during the course of writing this column, I accidentally held down for a second on an alpha character, causing the slide bar to appear. I never knew before that moment that hold-and-slide even existed in Contacts. Principle: If a capability is not visible and the developer does not teach that capability, it may as well not exist.
Damned straight! I had no idea the slide bar existed until I read that last paragraph earlier this evening.
Come on Apple, you can do better than this…
[Via Daring Fireball]
[...] Rising numbers of mobile, lightweight, cloud-centric devices don't merely represent a change in form factor. Rather, we're seeing an unprecedented shift of power from end users and software developers on the one hand, to operating system vendors on the other – and even those who keep their PCs are being swept along. This is a little for the better, and much for the worse. [...]
[Via The Brooks Review]
Future Drama: a compilation of designers of the (mostly quite recent) past's visions of the future, with a particular emphasis on videos depicting futuristic technology being deployed in real world situations.
You know the sort of thing: currently the trend is to depict elegantly dressed rich people toting around ultrathin tablet computers that they control via touch interfaces (often with some form of holographic display) whilst engaged in their job as a knowledge worker and/or high powered executive. Back at their hotel room after a hard day's collaboration, they use the device as a fancy videophone to chat with their cute pre-teen daughter back home about how school went today. 1 2 3
I snark, but I do find this sort of speculative work fascinating. Also, the Matt Jones blog post that pointed me in this direction is well worth a read: I've always seen this sort of video as a marketing tool aimed at gaining mindshare, but he's found that for designers placing their ideas on screen in the context where they'll be used can be immensely valuable, insofar as it helps them assess whether their ideas 'fit' in the real world. Good stuff.
Turbo Pascal 3 for MS-DOS was released in September 1986. [...]
The entire Turbo Pascal 3.02 executable – the compiler and IDE – was 39,731 bytes. How does that stack up in 2011 terms? Here are some things that Turbo Pascal is smaller than, as of October 30, 2011:
The minified version of jquery 1.6 (90,518 bytes).
The image of the white iPhone 4S at apple.com (190,157 bytes).
The Wikipedia page for C++ (214,251 bytes).
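Those comparisons are easy to check with a few lines of arithmetic, using the sizes quoted above:

```python
TURBO_PASCAL = 39_731  # Turbo Pascal 3.02 executable, in bytes

# Sizes as quoted in the post, October 30, 2011.
comparisons = {
    "jquery 1.6, minified": 90_518,
    "white iPhone 4S image at apple.com": 190_157,
    "Wikipedia page for C++": 214_251,
}

for name, size in comparisons.items():
    ratio = size / TURBO_PASCAL
    print(f"{name}: {ratio:.1f}x the size of the whole compiler and IDE")
```

Even the smallest item on the list is more than twice the size of the entire development environment.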
Not yet having made the switch to PCs at that point, I was probably still using HiSoft Pascal 4 on my ZX Spectrum, which was similarly minimalist by modern standards. But then, my Spectrum had 48 kilobytes of RAM, so everything had to be distinctly minimalist.1
[Insert standard "I feel so old!" lament here.]
[Via The Tao of Mac]
File under 'Famous last words': Computer Virus Hits U.S. Drone Fleet:
A computer virus has infected the cockpits of America's Predator and Reaper drones, logging pilots' every keystroke as they remotely fly missions over Afghanistan and other warzones.
The virus, first detected nearly two weeks ago by the military's Host-Based Security System, has not prevented pilots at Creech Air Force Base in Nevada from flying their missions overseas. Nor have there been any confirmed incidents of classified information being lost or sent to an outside source. But the virus has resisted multiple efforts to remove it from Creech's computers, network security specialists say. [...]
"We keep wiping it off, and it keeps coming back," says a source familiar with the network infection, one of three that told Danger Room about the virus. "We think it's benign. But we just don't know." [...]
Retaliation for Stuxnet, or someone too high up the chain of command to be told what to do getting a bit careless with their USB drive and bringing in some malware they picked up on their home PC?
Steven Levy's obituary for Steve Jobs is probably the best all-round non-technical summary I've read today of the life of the man who changed computing:
The full legacy of Steve Jobs will not be sorted out for a very long time. When employees first talked about Jobs' "reality distortion field," it was a pejorative – they were referring to the way that he got you to sign on to a false truth by the force of his conviction and charisma. But at a certain point the view of the world from Steve Jobs' brain ceased to become distorted. It became an instrument of self-fulfilling prophecy. As product after product emerged from Apple, each one breaking ground and changing our behavior, Steve Jobs' reality field actually came into being. And we all live in it.
Detractors will say – correctly – that few of Apple's products were truly the first of their kind. The Apple II was competing with Commodore's PET, the TRS-80 and a host of Z80-based S100 bus systems. The Mac and the Lisa were inspired by the Xerox Star. The iPod wasn't the first portable MP3 player. The post-1997 Macs increasingly used industry standard PC components, to the point where since the switch to Intel processors you could use your Mac as a bog standard Windows PC if you were so inclined.1 The iTunes Store wasn't the first online music store, it was just the one that benefitted from being slickly integrated with the world's best selling MP3 player. The iPad is far from being the first tablet computer the world has ever seen.
And yet … in between genuinely groundbreaking devices like the original Mac and the iPhone, Jobs and Apple kept on producing computing devices that were better designed, worked better and were continually updated instead of being milked for profits. If Apple didn't make something first, it had an enviable track record of making it better. "It Just Works" was the slogan: it wasn't 100% accurate – computers are complicated machines doing complicated things, and there's a limit to how far even Apple can keep them from falling over at the most inconvenient moment possible – but Apple have come closer to making the slogan reality than any other IT company in the microcomputer era.
Doing that once could be down to luck. Doing it two or three times would be a neat trick. Pulling it off umpteen times over the course of some thirty-odd years tells you that the company had something special. With all due respect to Woz and Jef Raskin and Jonathan Ive and Tim Cook and the many people who made MacOS X the nicest desktop Unix system in creation and created all the other minor miracles Apple has produced over the last 15 years, it's pretty clear that Steve Jobs was that something special.