Feb 3 · Liked by David Kingsley, PhD

And the Luddite checks in: Hard pass. The chip technology will not stay one-way. It will be a very, very short step from using implants to direct your thoughts, to having your thoughts directed by the implants. At which point they will no longer be "yours" in any meaningful way. Anyone who enjoys the non-stop software updates to his phone will love the updates to his chip’s software. If you don’t believe me, ask me: Canada’s government produced this https://archive.is/d3dOt irritatingly-written document, “Exploring Biodigital Convergence,” exactly four years ago and lists “Change human beings—our bodies, minds, and behaviours” as a goal by page four. “Change” will no longer be a voluntary, internal, self-directed process. It will be imposed on you by the people who know better than you what you *should* be. And who approved Neuralink implants? The FDA? That's the most captured regulatory agency in world history. I'd eat Dumpster lettuce before I trusted anything the FDA said, including, “You're standing under a falling piano. Move over.”

The NSA (and FBI and CIA and DoD) is in your email, texts, and phone calls now. Everything you write, everything you say, and everything you look up on the web. It’s called “data mining” and not “data carefully curated after obtaining a warrant” for a reason. Do you have a driver’s license? Your photograph is in the feds’ facial recognition surveillance database. The US is a country in which, on a good day, our privacy is already considered optional, and far more often it's treated as an obstacle by people who want to control, surveil, monitor, and change us. Anyone who clicked the “fascinated” button on the survey here should use the freedom he still has to search “fusion center” and “real time crime center” and then reconsider whether “fascinated” is really the word he’s going for.

Putting a chip in someone's head is no more evolution in a legitimate biological-genetic sense than a pacemaker is. I don’t know what Musk means by "increasing the bandwidth between humans and computers,” but that sounds like a computer problem, not a human problem: It’s the machines that should be adapted to serve us, but that’s clearly not the point of this technology. AI already makes it extremely hard to distinguish real from fake, and I agree that it’s an existential risk to humanity, so exactly how does embedding it in a human brain mitigate that risk? It doesn’t. It can’t.

Quadriplegics aside, why would anyone want to link his brain to a machine, anyway? To what end? Everyone already walks around with a supercomputer in his pocket, and what do people do with those incredible multi-terabyte all-in-one talk-film-write thingies besides judge each other on Tinder? Four years ago people in this Information Age couldn't even be bothered to look up "common cold" on Wikipedia, and they haven’t done it since, either. Exchanging a swipe or tap for a thought will not transform such people into browsers from the tree of knowledge. But then, this technology is not for them, although it's packaged and sold that way. It's for the controllers of the technology, who want to control *people*. Even if you turn your phone off, you can still be—and are—tracked everywhere you go. Why? Who benefits from that? Not the phone owner. There’s no innocent, benevolent reason for non-stop surveillance. What, so emergency services can find you the next time you stray from a national park trail and get lost? Pfft. The decision to allow a chip to be implanted in your brain is irrevocable. Even if you retain the power of decision-making after that, what happens if you want the thing removed? That’s not a kitchen table operation. That is a leash that you will never get off your neck.

Whether it’s called One ID or Neuralink or Contact Tracing or Vaccine Passports: It's not that I don't trust the people developing this stuff. It's what I trust them to do with it.

author

Fager, your thoughts are insightful (as usual) and I appreciate the pushback on this from us Technophiles.

First, I agree that this technology going from simply 'reading' to also 'writing' is an obvious next step. I doubt there will even be many major hurdles to performing it. In fact, there are already systems that do this in experimental labs. In one example, researchers used microelectrode arrays embedded within a petri dish as a 'read and write' interface for neurons grown inside. Within this 'read-write system', the researchers were able to teach neurons to play the Atari game 'Pong'. The academics behind this research now have a startup company called 'Cortical Labs'. If there is interest here, I could do a similar deep dive. However, before we start discussing 'reading and writing', let's first discuss the reading aspect.

I disagree that the act of 'reading' neural signals is inherently negative. Imagine, for a moment, a future where non-invasive devices like headsets could seamlessly translate our thoughts into actions. This isn't about usurping our natural abilities but augmenting them, reducing the lag between intent and interaction—much like how digitizing libraries transformed our access to information, enhancing the bandwidth of our knowledge exchange. These types of innovations have traditionally had long-term benefits for technological progress.

Okay, now I'll come back to operating in the real world. The specter of surveillance and privacy invasion looms large for me as well. The Snowden revelations were a chilling moment for me, and I have never quite gotten over the shameless spying we are now all expected to live with (there is no reason good enough). The further potential for our innermost thoughts to be accessed, or even altered, by government agencies without consent is indeed a chilling prospect. We have fundamental rights to privacy and autonomy; perhaps we should be finding ways to protect them.

A while ago, I read a metaphor that a society's push toward technological advancement and innovation is like drawing balls from an "Urn of Knowledge", which can be a double-edged sword. Each new discovery brings potential for both progress and peril. I can see the rationale for being a Luddite.


After I posted that comment I told someone, "He must wish he could unsubscribe me," because all I do is come here and leave over-long, nay-saying screeds. In my defense, I think nearly all the technology you cover here is positive—maybe not inherently, but definitely potentially. Four years ago I’d have chomped on it like a bass, without reservation, or with very few. I wouldn’t have believed it possible for people to twist something that’s next-door to miraculous into something so menacing. But, you know: They do, and the worst of them are on record as wanting to. They can’t stop announcing their plans, like the villains on Doctor Who.

You know that Leonard Read essay, "I, Pencil"? I can look at a photo of a gas chromatograph or of a hospital trauma bay and be blown away with admiration for the chemistry and physics and processes that go into producing a lowly ten-blade or a single component--just the damned housing--of a single machine that goes "ping." I love being gobsmacked by the intelligence, imagination, creativity, and skill of people working in science and tech. I love reading about it. Even though—or maybe because—it’s mostly way over my head, it’s inspirational. Most human eras, including this one, don’t offer enough reasons to look up and look ahead. Having learned what I have since 2020, though, I read about some of these tech developments and feel like I’m tied to a chair and locked in a room with a toddler that’s waving a loaded gun.

I followed Cortical Labs' link to their cell.com paper and left that none the wiser, so yes, I'd be interested in being explained to about their work with neurons. The other head-scratcher was the light-powered yeast for biofuels, although having since read Wikipedia (which is about my speed) on biofuels it makes a little more sense.


Great points. This comment made me change my vote on the survey from fascinated to uncomfortable.


I mean, it *is* fascinating, and I'd love to enjoy it free of the knowledge of what it's likely to be used for. Technology has so much promise and implies so much about our potential as a species, and yet as a species, often we're still no farther along morally than we were when we climbed down from the trees. Too much of our technological progress is just a more elaborate way of forging chains for each other.

Feb 11 · Liked by David Kingsley, PhD

How much do you think a Neuralink should cost for the common citizen?

author

I'm not sure this version of Neuralink should be used by any regular person. But I have to imagine anything requiring neurosurgery is going to have a six-figure price tag. Consider a few of the cost elements: the surgeon and surgical preparation, the surgery itself, follow-ups, computer interfaces, and the chip.
