A fascinating account of the history of JAWS and NVDA.
If only all thinkpieces on complexity in software development were written in such an entertaining style! (Although, admittedly, that would get very old very fast.)
A layman’s guide to thinking like the self-aware smol brained
I believe that we haven’t figured out when and how to give a developer access to an abstraction or how to evaluate when an abstraction is worth using. Abstractions are usually designed for a set of specific use-cases. The problems, however, start when a developer wants to do something that the abstraction did not anticipate.
Smart thoughts from Surma on the design of libraries, frameworks, and other abstractions:
Abstractions that take work off of developers are valuable! Of course, they are. The problems only occur when a developer feels chained to the abstractions in a situation where they’d rather do something differently. The important part is to not force patterns onto them.
This really resonated with parts of my recent talk at CSS Day when I was talking about Sass and jQuery:
If you care about DX and the adoption of your abstraction, it is much more beneficial to let developers use as much of their existing skills as possible and introduce new concepts one at a time.
Time and again, organizations have sought to contain software’s most troublesome tendencies—its habit of sprawling beyond timelines and measurable goals—by introducing new management styles. And for a time, it looked as though companies had found in Agile the solution to keeping developers happily on task while also working at a feverish pace. Recently, though, some signs are emerging that Agile’s power may be fading. A new moment of reckoning is in the making, one that may end up knocking Agile off its perch.
I’m glad that Heydon has answered this question once and for all.
I’m sure that’ll be the end of it now.
This is a wonderful piece of writing by Marcin, ostensibly about bug-fixing but really an almost existential examination of the nature of coding.
Bugs are, by definition, a look backward—at past behavior, at code that already exists, at the old work of engineers whom you’ve never met. It can feel more fun to write new code, chart new territories, add new functionality.
But the past can be fun, too. A good bug is a puzzle. A mystery. A whodunit. To solve a bug, sometimes you have to be a scientist: observe and measure, chart out the logic, follow the math. But then, two minutes later, you need to wear a hat of a very particular detective—take your flip notepad and interview different pieces of code to understand not what they claim they do, but what they actually do.
Ah, this brings back memories of hacking on the WorldWideWeb project at CERN!
(Not the original one. I’m not that old. I mean the recreation.)
The paradox of performance:
This era of incredibly fast hardware is also the era of programs that take tens of seconds to start from an SSD or NVMe disk; of bloated web applications that take many seconds to show a simple list, even on a broadband connection; of programs that process data at a thousandth of the speed we should expect. Software is laggy and sluggish — and the situation shows little sign of improvement. Why is that?
Because we prioritise the developer experience over the user experience, that’s why:
Although our job is ostensibly to create programs that let users do stuff with their computers, we place a greater emphasis on the development process and dev-oriented concerns than on the final user product.
We would do well to heed Craig’s observations on Fast Software, the Best Software.
If you employ a hack, don’t be so ashamed. Don’t be too proud, either. Above all, don’t be lazy—be certain and deliberate about why you’re using a hack.
I agree that hacks for prototyping are a-okay:
When it comes to prototypes, A/B tests, and confirming hypotheses about your product, the best way to effectively deliver is actually by writing the fastest, shittiest code you can.
I’m not so sure about production code though.
New technologies don’t have power; for that they’d need a community, documentation, and a thriving ecosystem of ancillary technology. What they have is potential, which resonates with the potential within the startup and the early adopter; perhaps they can all, over time, grow together.
This means startups don’t adopt new technologies despite their immaturity, they adopt them because of that immaturity. This drives a constant churn of novelty and obsolescence, which amplifies the importance of a technologist’s skillset, which drives startups to adopt new technologies.
This flywheel has been spinning for a long time, and won’t stop simply because I’ve pointed out that we’re conflating novelty with technological advancement. Hopefully we can slow it down, though, because I believe it’s causing real harm.
Languages, platforms, and systems that break from the norms of computing.
I don’t think I agree with Don Knuth’s argument here from a 2014 lecture, but I do like how he sets out his stall:
Why do I, as a scientist, get so much out of reading the history of science? Let me count the ways:
- To understand the process of discovery—not so much what was discovered, but how it was discovered.
- To understand the process of failure.
- To celebrate the contributions of many cultures.
- Telling historical stories is the best way to teach.
- To learn how to cope with life.
- To become more familiar with the world, and to know how science fits into the overall history of mankind.
I never knew that the way I add other people’s code to my projects is called “vendoring.” I thought it was just copying and pasting.
The title says it all, really. This is another great piece of writing from Paul Ford.
I’ve noticed that when software lets nonprogrammers do programmer things, it makes the programmers nervous. Suddenly they stop smiling indulgently and start talking about what “real programming” is. This has been the history of the World Wide Web, for example. Go ahead and tweet “HTML is real programming,” and watch programmers show up in your mentions to go, “As if.” Except when you write a web page in HTML, you are creating a data model that will be interpreted by the browser. This is what programming is.
Don’t blame it on the COBOL:
It’s a common fiction that computing technologies tend to become obsolete in a matter of years or even months, because this sells more units of consumer electronics. But this has never been true when it comes to large-scale computing infrastructure. This misapprehension, and the language’s history of being disdained by an increasingly toxic programming culture, made COBOL an easy scapegoat. But the narrative that COBOL was to blame for recent failures undoes itself: scapegoating COBOL can’t get far when the code is in fact meant to be easy to read and maintain.
It strikes me that the resilience of programs written in COBOL is like the opposite of today’s modern web stack, where the tangled weeds of nested dependencies ensure that projects get harder and harder to maintain over time.
In a field that has elevated boy geniuses and rockstar coders, obscure hacks and complex black-boxed algorithms, it’s perhaps no wonder that a committee-designed language meant to be easier to learn and use—and which was created by a team that included multiple women in positions of authority—would be held in low esteem. But modern computing has started to become undone, and to undo other parts of our societies, through the field’s high opinion of itself, and through the way that it concentrates power into the hands of programmers who mistake social, political, and economic problems for technical ones, often with disastrous results.
I decided to implement almost all of the UI by just adding & removing CSS classes, and using CSS transitions if I want to animate a transition.
Yup. It’s remarkable how much can be accomplished with that one DOM scripting pattern.
I was pretty surprised by how much I could get done with just plain JS. I ended up writing about 50 lines of JS to do everything I wanted to do.
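To illustrate the pattern (with element and class names that are my own placeholders, not taken from the original post), here’s roughly what that kind of DOM scripting looks like:

```javascript
// A minimal sketch of the pattern: the UI state lives in a class name,
// and the script only ever adds or removes that class.
// Assumes a stylesheet with something like:
//   .panel { opacity: 0; transition: opacity 0.3s ease; }
//   .panel.is-open { opacity: 1; }
// so the animation is handled by CSS, not by JavaScript.
const button = document.querySelector('.panel-toggle');
const panel = document.querySelector('.panel');

button.addEventListener('click', () => {
  panel.classList.toggle('is-open');
});
```

Because the state is expressed as a class name, the JavaScript stays tiny and the presentation, including any transitions, stays entirely in the stylesheet.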