Stack Overflow at 10

There’s a certain irony in the fact that one response in the comments on Jeff Atwood’s post commemorating the 10th anniversary of the launch of Stack Overflow was to suggest that the post be marked as a duplicate of https://stackoverflow.blog/2018/09/27/stack-overflow-is-10/.

I think Jeff Atwood puts it best himself:

Interesting, so we can close posts as duplicates across completely different websites now? Fascinating. I hope all websites on the internet get the memo on this exciting new policy!

For what it’s worth, I’m the most amateurish of programmers, and over the years I’ve found Stack Overflow immensely useful. Read the answers carefully and there’s an astounding amount of useful information in there.

Standard Notes

On my radar, for if (when) Evernote stumbles: Standard Notes

A writing experience unlike any other. Standard Notes is free to use on every platform, and comes standard with cross-platform sync and end-to-end privacy. For those wanting a little more power and flexibility, we created Extended, which unlocks powerful editors, themes, and automated backups.

There’s an argument to be made that Evernote has been stumbling from the moment it aspired to become a unicorn, but I’m thinking more of the way the company recently started haemorrhaging senior executives and seems directionless. The only saving grace, from where I’m sitting, is that it isn’t OneNote, which I use at work and which plainly satisfies the needs of lots of people who are deeply tied into the Microsoft Office ecosystem, but which definitely isn’t for me, especially now that I do my personal computing on iOS.

[Via 4 Short Links, via Things That Have Caught My Attention]

Root?

Darius Kazemi might just be some kind of evil genius:

I gave a talk at CornCon 2018 about the history of the cron utility in UNIX systems, in the character of a man who gradually realizes that he is not speaking at CronCon, a conference about the time-based scheduler, but rather at CornCon, a conference about the cereal grain, also known as “maize”. Thanks to Casey Kolderup for taking video, and Jen Tam for hosting me.

Be sure to follow the link to see his entire performance. The moment he started in on the significance of root in the two contexts at hand, I just lost it.

[Via A Whole Lotta Nothing]

Choices, choices

As of Google Chrome version 69, Google are treating being logged in to any Google service as the same thing as being logged in to Google Chrome:

Most Google services have for me this in common with Facebook: these services are too deeply integrated and impossible to use in part or isolation. It’s either the entire system or nothing, based on how the question of consent is approached. You would like to use GMail (logged in obviously) but Google search, Youtube, Chrome etc without a login? No can do. You selected strict settings in Facebook for your profile data? You’re just an API/permission redesign away from having your choices nullified. Part of me feels that this Chrome shared computer issue that Googlers mentioned is real, but it’s also just too convenient to solve this by tying Chrome closer to Google, you know?

Note to Google: any time you find the software engineering decisions you’ve made being compared with those made by Facebook, that’s probably not a good thing for your end users these days.

[Via Extenuating Circumstances]

Love Notes to Newton

I’d somehow failed to notice that a documentary about the Apple Newton had been released: Love Notes to Newton is a mix of historical footage about the machine’s development and tributes to the dwindling band of Newton aficionados who have tried hard to keep their Newtons in daily use in a world where the smartphone in your pocket utterly outclasses its ancestor.

It’s fair to say that the Newton was an inspiring failure: Palm were the most visibly successful company that tried to follow in the Newton’s footsteps, but they didn’t ever get beyond the geek market. [note]I know: I owned several Palm products, and a number of Psions before that.[/note] While few users refer to their smartphone as a PDA, that’s just what it is. The biggest difference between a smartphone/PDA and a Newton is that the Newton’s operating system took great pains to revolve around collections of object-oriented data that it made available to any other program on the device, whereas modern smartphones run standalone apps and tend to have tighter constraints on what data is visible to different apps. To a large extent, if you can trust Newton fans to be objective for a minute, the difference is that smartphones substitute sheer processor horsepower for smart software.

It’s tantalising to wonder what could have happened if the Newton had survived a bit longer after the return of Steve Jobs to Apple: might the improvements in Newton OS 2 (and whatever might have come to pass in Newton OS 3, had they got that far) have allowed the platform to flourish? Or was it unfortunate enough to be a revolutionary product from a company that couldn’t afford to wait for it to outgrow the bad reputation it was saddled with after Apple over-promised what it would one day be capable of, and doubly cursed because it was a highly visible effort by a recently ousted CEO to be a visionary in the mould of his predecessor/successor?

The thing is, right now Apple’s iOS team would look at this documentary and think it couldn’t happen to them. It not only can, but one day it almost certainly will.[note]If Apple are lucky, whatever comes in the wake of the success of iOS will be an Apple product that somehow delivers sufficient backwards-compatibility – if only in terms of the multimedia formats it supports – to lock customers into the Apple ecosystem. If Apple are unlucky, the story of iOS 16 will be everyone flocking to whatever new toys Android/Microsoft/Samsung/Huawei have on the market, and nobody will care what iOS 19 brings to the party.[/note]

Anyway, Love Notes to Newton is definitely worth a watch if you have any sense of how things were when John Sculley was running the show and it wasn’t at all clear where Apple’s next hit product was coming from.

[Via 512 Pixels]

One Day

I guess the reason people keep coming back to being inspired by Vannevar Bush’s Memex, Alan Kay’s Dynabook and what have you is that the internet as it currently exists falls so far short of the dream of what a global information network could have been. Thank goodness that dreams like this keep popping up:

[This is going to be…] a very rough sketch of an idea about what a future computing system might look like. I don’t know how to get from here to there, or even if ‘there’ is entirely satisfactory. But I feel that a ‘there’ roughly in this vicinity is somewhere we should be heading towards.

Let’s start with what the ‘here’ is that is less satisfactory.

We currently have an Internet made of vast layers of complexity layered on each other; software layers going back to the 1960s at the very latest, built on traditions and workflows originated in the 1950s. Our current model of deploying computing services, ‘the cloud’, thinks nothing of *simulating entire computers* – with gigabytes of RAM and hundreds of gigabytes of disk – on other computers, just to get one service that listens on one TCP/IP port and sends a few bytes in response to a few other bytes. [Emphasis added]

The operating system inside these simulated computers-on-computers then consists of, essentially, an entire simulated computing department from the 1950s: a bank of clerks operating card punches (text editors and Interactive Development Environments), other clerks translating these punchcards from high-level to low-level languages (compiler toolchains), machine operators who load the right sets of cards into the machine (operating systems, schedulers, job control systems), banks of tape drives (filesystems and databases), printers (web servers, UIs)… and a whole bunch of prewritten software card stacks (libraries, component object systems, open source projects).

This seems a bit less than optimal. […]

Just a bit, yes.
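To make the quoted complaint concrete: the actual service at the heart of all that simulated machinery (something that listens on one TCP/IP port and sends a few bytes in response to a few other bytes) can be written in about a dozen lines. Here’s a minimal Python sketch, purely illustrative and not taken from the essay; the port number is an arbitrary choice:

```python
# A toy service of the kind the essay describes: listen on one TCP/IP port,
# read a few bytes, send a few bytes back.
import socket

with socket.create_server(("0.0.0.0", 8080)) as server:
    while True:
        conn, _addr = server.accept()          # wait for a connection
        with conn:
            request = conn.recv(1024)          # a few bytes in...
            conn.sendall(b"pong: " + request)  # ...a few bytes back out
```

Everything else in the quoted description, the simulated computer, the operating system, the toolchains, exists to get something like those dozen lines running somewhere.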

The point isn’t that this essay points to an obvious right answer: it’s that the current solutions fall so far short of what could be done. Building ever-taller stacks of old technology on top of stacks of even older technology might be good for maintaining the market share of the market leaders, but it’s probably not the best way to get to where we’d like to be one day.

Anyway, the point is that the essay I’ve linked to offers plenty of food for thought.

[Via Extenuating Circumstances]

LinkedIn: The Game

Beating LinkedIn: The Game is tricky, but not impossible. If you can believe this guy:

The general goal of LinkedIn (the game) is to find and connect with as many people on LinkedIn (the website) as possible, in order to secure vaguely defined social capital and potentially further one’s career, which allows the player to purchase consumer goods of gradually increasing quality. Like many games, it has dubious real-life utility. The site’s popularity and success, like that of many social networks, depends heavily on obfuscating this fact. This illusion of importance creates a sense of naive trust among its users. This makes it easy to exploit.

To novices, the game appears to be open-ended, and impossible to “beat” in any clearly defined sense. But it is, in fact, possible to win at LinkedIn. I have done so, and you can too, by following this short strategy guide. […]

This would be even funnier if I could just shake the premonition that a few years from now some high-flying junior minister in the DWP will announce that, in the interests of reassuring hard-working taxpayers that their hard-earned money is being used to fund the most agile, modern and thoroughly digital solution to the problem of unemployment available, all claimants of Universal Credit will be required to provide evidence that they have registered with Microsoft’s LinkedIn service and pursued at least 10 job opportunities a week. Even more importantly, Microsoft will have kindly agreed to take up a contract to police this target, and consequently a portion of existing DWP staff in Jobcentres will be transferring to the private sector to work in the new MSDWP service, which will also be taking over the contract to run the Universal Credit system.

Magically, this move would not only allow the DWP to wash their hands of all responsibility for administrative cock-ups in Jobcentres, but also bring to an end all those boring National Audit Office reports that kept on rating the Universal Credit programme as risky and over budget.[note]Not because Universal Credit would suddenly become a useful or helpful service or anything ridiculous like that. It’s just that, sadly, the commercial confidence clause the DWP had agreed to in setting up MSDWP would make it a criminal offence to reveal details of the workings of Universal Credit to mere elected members of Parliament or their civil servants. Oh well, if that’s the price of doing modern business in the digital age then so be it.[/note] You might laugh, but give it a few years and some Ayn Rand-reading acolyte, a decade or so out of university and a couple of years into his or her tenure as a Conservative member of Parliament, will think this the best way to distance the government from the embarrassment of Universal Credit. The main problem will be finding someone within Microsoft both senior enough to agree a deal of that size and dumb enough not to recognise it for the hospital pass it would be.

[Via The Tao of Mac]

Atari 520 STM

Having just read The Jackintosh: A Real GEM – Remembering the Atari ST, I feel a massive nostalgia rush coming on:

After Commodore Founder Jack Tramiel was forced out by his board, he decided, after a brief hiatus, to get revenge.

Tramiel knew that a 16-bit computer was next on the horizon for Commodore, and he wanted to beat them to the punch. So, in early 1984 he formed a new company, Tramel Technology (spelt without an ‘i’ to encourage people to spell his name correctly), and lured a number of Commodore engineers to jump ship and come work for him. […]

Back in the late 1980s, after several years of following Sinclair Research’s product line up to and including the Sinclair QL[note]Which, I maintain to this day, had the potential to be a fine machine if someone could just have persuaded Sir Clive to drop the Not-Invented-Here attitude, equip it with a decent keyboard and ditch the bloody MicroDrives! At one point I had my QL, its circuit board re-housed in a decent keyboard and the maximum RAM installed (128KB on board plus a whopping great 512KB in the expansion slot!), happily running multiple copies of Psion’s Abacus spreadsheet and Quill word processor side by side, thanks to a patch to QDOS – I wish I could remember its name! – that enabled limited multitasking/task switching and worked remarkably smoothly.[/note], I found myself tempted by the Atari 520STM, the model with a decently high-resolution (for the day and price) monochrome monitor. OK, so the 520STM was never going to be a games machine, but it was a cracking little workhorse for desktop publishing (I adored Timeworks Desktop Publisher) and I spent way too much money on nifty GEM-based word processors and spreadsheets over the years. That first version of Digital Research’s GEM environment worked beautifully on the hardware, to the point that, several years later, when I finally gave in to the rising tide and bought a Windows 95-based machine for my personal use (having long since been using DOS/Windows systems at work), I genuinely felt like I was taking a step down, both usability-wise and looks-wise.

[Via Extenuating Circumstances]