The new buzzword is as overhyped as the old buzzword

In the Guardian, Sean Monahan offers a skeptical take on the metaverse. It’s not a term I had heard before, but it isn’t new: Neal Stephenson coined it in 1992, evidently because he wanted a synonym for cyberspace that didn’t sound like too much of a William Gibson knockoff.

Mark Zuckerberg evidently thinks that the metaverse is the next big thing. Discussing an interview with Zuckerberg, Casey Newton at the Verge offers a parenthetical joke: “The metaverse being unavailable to us at press time, we used Zoom.”

So far as I can tell, though, the metaverse is just supposed to be an on-line “there” which has the structure of a social and practical space. Thinking that this will be a mind-splitting development misses the fact that we’ve had it for decades already. Back in 2000, I argued that internet chat rooms created virtual social spaces that were largely independent of physical space.

Continue reading “The new buzzword is as overhyped as the old buzzword”

The future as it looked from the past

I have a book called Values and the Future on my shelf which I take down and read short passages from occasionally. In one article, Theodore J. Gordon offers “Forecasts of certain technological developments and their potential social consequences” from his vantage point in 1969. The prospect of wide-band communication systems suggests these possible consequences for education:1

Ready and cheap availability of excellent curricula… might make education a respected and common pastime.

Canned lectures by eminent professors may make TV teaching superior to that in resident institutions.

University degrees will be extended to viewers who complete their courses solely on TV. Residency requirements may disappear.

Cyberpunk ambitopia

When I got my first iPhone, I wrote that its “compressed functionality underscores the extent to which the internet has changed things. If you had told me about it when I was a kid, I would not have been able to wrap my head around it.” It’s a camera, a calendar, an address book, a pocket watch, a GPS. It also takes calls, although I use it for text messaging more than voice.

When I imagined future technology as a kid, I often imagined smart houses. I was recently targeted with an on-line ad for a front door lock that you can control from your phone. This is like the computerized houses of my elementary-school imagination. I should be excited, but I’m not.

The future has gritty problems that 1980s cyberpunk novels didn’t prepare me for.

Continue reading “Cyberpunk ambitopia”

Sponsored links are the new spam

[Cartoon: “educate Washington”]

Steven Frank drew the webcomic Spamusement from 2004 to 2007. The schtick was “Poorly-drawn cartoons inspired by actual spam subject lines!”

It was a genius idea. Frank encouraged other people to draw their own, based on spam they’d received. Back in the day, I drew about a dozen. Drawing them was a pleasant kind of mental palate cleanser, doodling that was tethered loosely to the verbal part of my brain.1

Continue reading “Sponsored links are the new spam”

E-publishing boondoggle

Via Daily Nous, I learn that Pacific Philosophical Quarterly (PPQ) has begun offering an odd choice to authors. When a paper is accepted, the author can opt either to have their paper appear posthaste in an on-line-only issue or to wait years for it to appear in a print issue. Articles in the print issue will appear on-line at the time of publication.

The publisher insists that the on-line-only issues and the print+on-line issues will be of the same prestige and significance. After all, a paper is accepted for publication before being assigned to one or the other.

Continue reading “E-publishing boondoggle”

Tweets point nowhere

Mark Simonson’s blog got me thinking about information technology and the original aspirations of hypertext. Simonson laments that current technology is too much driven by concepts taken from print media. Part of the problem is the lack of a clearly defined alternative. Ted Nelson, who coined the word “hypertext”, had a vision of multiple texts floating on-screen with lines connecting points in one to points in another. I don’t see how that wouldn’t end up like items on a cork board linked by lengths of yarn: the idiom for madness from A Beautiful Mind, which has become Hollywood shorthand for crazy conspiracy theories.

Old school blogging actually seems like a pretty good realization of hypertext. Good blog posts take a while to write because you’ve got to provide pointers so that someone who hasn’t got context or who is curious can follow up. Someone who wants even more can search on key terms.

All of this crystallized for me what I don’t like about Twitter. In order to cut a thought down to Tweet length, people leave out context. What are they enraged about? What’s the thrust that drew their clever riposte? I can’t always tell.

Sometimes thoughts that won’t fit into a single tweet are written as a stream, possibly with numbered entries 1/9, 2/9,… I see entry 4 of 9 because someone retweeted it, and it’s a serious investment of effort just to view the original series in order. Even then, I can’t always suss out the context.

Twitter, in short, is hypotext. It eschews not only the links of hypertext but also the context you’d expect from a letter or newspaper article.

Part of the shift is that many people go on-line primarily with phones or tablets, appliances that are great for scrolling and clicking but bad for following multiple threads. Twitter and Facebook turn our feeds into one-dimensional things. We can scroll through, liking and reposting as we go. But reposting just drops another log somewhere into the flume.