Kelly’s Digital Autobiography

This personal digital narrative was inspired by Douglas Eyman’s introduction to Digital Rhetoric: Theory, Method, Practice and is my attempt to make sense of where my own life fits within digital culture change. I documented my process on my Miro mindmap.

A narrative timeline

1969

First breath

The day I was born, September 2, 1969, is the same day that two computers exchanged data for the first time, a day that many consider the first breath taken by the Internet. My mother always commented that she was pregnant with me on her own birthday, July 20, 1969, when the first human walked on the moon, an achievement far more widely celebrated than the obscure technical milestone weeks later that would have a far greater impact. From that first computer connection came ARPAnet, and then the internet, which has grown geometrically in all the years that I was slowly learning to talk and read and write and think. I learned this factoid years later while writing an internet timeline for an article (now 28 years in the past), when I thought digital technology was reaching a pinnacle and the coincidence felt meaningful. But you can’t see the peaks until you know where the valleys are. My personal narrative has those low points, but digital technology and connectivity have not stopped the climb they began on the coincidental day of my birth.

1979

Screen time

After dabbling in Pong and handheld video football, my family’s big entry into the digital world was a Sears-branded, Atari 2600-compatible gaming unit. My budget-minded family might not have joined the videogame trend if Atari hadn’t licensed its technology to a mass-market partner like Sears (where my richest uncle worked) and to other game developers, whose titles seemed to outstrip Atari’s own offerings. Activision’s River Raid was the favorite among my sisters and me, who otherwise spent our time in the actual unnamed creek that ran through our yard.

 

While I later played ColecoVision and Intellivision at friends’ houses, Atari was the beginning and end of gaming systems for the TV-obsessed Andrewses. With the set on day and night, we had no time for games.

 

Video hat tip to https://wearethemutants.com/.

1987

Late adopter

Brother word processor

I began my freshman year in college with a newly minted Brother word processor — not a computer, but a machine with a small screen that allowed me to type, save my work, edit minimally, and print. The first time I used a personal computer was on a laptop in my senior-year high school English class, and the word processor was a step up, since that laptop had a small LCD window that showed only about 30 characters at a time.

 

By contrast, some of my wealthier or more technological classmates arrived with personal computers, technologically advanced CD players (the discs supposedly would not scratch), and brand-new Nintendo game systems. I bought myself a CD boom box with my summer earnings and joined the Columbia Record Club, but I stuck with my word processor, printing off papers minutes before class. My contemporary Douglas Eyman may have spent his time in computer labs on Internet Relay Chat, but I was just happy that my word processor freed me from correctable typewriter ribbons and Liquid Paper.

 

Image source: https://www.iretron.com/

1992

Wax layout and dumb terminals

My first jobs post-college were not-quite-digital. Unable to afford a job in New York publishing, I stayed in Fort Washington and secured a job amid the glitz of my hometown’s finest trade paper: Construction Equipment Guide. My first exposure to professional publishing involved running typeset text through a waxing machine, which coated the back of the paper with stripes of melted adhesive wax so it could be cut up and pasted into layout form. I did not consider this digital, but Douglas Eyman cites Angela Haas, who wrote that all writing is digital: “digitalis, in Latin, means ‘of or relating to the fingers or toes’ or ‘a coding of information’” (Eyman, p. 19). If all writing is digital, cut-and-paste modular paste-up qualifies as well.

 

My next job was a step back professionally but a step forward technologically: I was a customer service rep for Adelphia Cable. I sat in front of a dumb terminal — a keyboard and a black screen with green text linked to the customer mainframe — fielding complaints about pay-per-view Wrestlemania broadcasts that didn’t come through or outages caused by the county-wide upgrade to fiber-optic cable that would eventually make high-speed internet and streaming possible. Technology requires infrastructure — not just Steve Wozniak but the folks splicing the cable and dodging irate callers.

 

1993

Office space

My first real introduction to computing began in a publishing job. I landed an editorial assistant position at Saunders College Publishing — one of the least glamorous arms of Harcourt Brace — working on textbooks in developmental math (i.e., high-school-level math for college students who needed additional preparation). I used Word for DOS 5.0 for all of two or three months before moving to Windows — a giant leap forward for WYSIWYG interfaces — and my first email account.

 

While we at Saunders produced primitive educational math software on 3.5-inch discs, I still distributed massive manuscripts on paper for review, where the photocopier, fax machine, and FedEx dropbox were at once essential tools and nemeses. Early email accounts could not handle the file sizes manuscripts required — digital technologies developed and were adopted in a stagger, leaping ahead in one area and lurching in another to catch up.

 

 

1994

You’ve got mail

My first exposure to the internet came via my boyfriend’s father’s AOL account. I remember huddling around his desktop, looking up our college websites, and playing AOL trivia games.

 

Nicholas Carr wrote that “the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind” (Nicholas Carr, “Is Google Making Us Stupid?” The Atlantic, July/August 2008). My first exposure was a revelation; little did I know that the AOL trivia game that hooked my Quizzo-obsessed boyfriend and me was the point of a spear that would change the meaning of trivia, where every barroom dispute can be resolved with a few keystrokes.

 

In my own walk-up apartment, I used a Mac Classic decommissioned by my old employer to write my grad school papers; the machine was too old to be compatible with dialup. Without the Internet, it offered only one improvement over my old Brother word processor: my Microsoft Word files could be transferred via disc to other computers, and they were forward-compatible.

 

1996

Will markup for food

Very little time elapsed between my first exposure to the web and my first professional experience developing HTML websites at my then-employer, The Partnership Group, a child- and elder-care referral service, at a time when information was easier to obtain by phone (or phonebook) than via the Internet. I was an editorial specialist who wrote printed infosheets that our phone consultants sent out, and when the company first launched its website, I infiltrated the project, although I wasn’t on the original team.

 

By 1996 I had a job as a professional web editor at a D-list magazine for entrepreneurs, deploying HTML, Adobe PageMaker, discussion boards, and online banner advertising along with our external vendor. The jewel of my writing crown was an interview with Jeff Bezos just as Amazon went public. (I will never know why he agreed to speak with me.)

 

While I idealized The New Yorker as the pinnacle of writing, my personal ambitions had shifted to the digital magazines Slate and Salon — hypertextual writing, with its connections and breadcrumb trails between topics and sources in an internet publication. Eyman cites George Landow’s definition of hypertext as the center of digital rhetoric: “By inserting the individual text into a network of other texts, this information medium creates a new kind of textual entity — a metatext or hypermedia corpus” (Eyman, p. 25). I applied to journalism school in hopes of closing the gap between where I was and the dynamic digital newsrooms in Silicon Valley and Silicon Alley.

1998

I came, I saw, iMac

In 1998 I took on a lucrative writing project that required me to purchase an original first-generation turquoise iMac, hooked up via Earthlink dialup. I used the new tech to work for a dinosaur of a publishing model: collectible cards about the nutritional and culinary benefits of different ingredients, delivered by a subscription service similar to recipe-card clubs. It’s hard to believe that model existed in 1998, and I’m more shocked to discover it still exists in 2023! While it seemed archaic then, the cards were the kind of modular content that later lent itself to online publishing — a precursor of food blogs and content creation across social media platforms.

 

Later in the year, I began a dead-tree job as senior editor of a print trade magazine about direct marketing. The discipline was in a state of disruption, making the transition from envelope stuffing and mailing lists to data collection and digital marketing. As paper catalogs began to lose primacy and print trade magazines along with them, I launched an email newsletter about digital marketing that reached 50,000 subscribers a week.

 

Image source: https://www.macworld.com/article/190458/original_imac.html

1999

Dotbubble

After writing about e-commerce, I wanted to be part of the dot-com boom. One of the early movers in the space was CDNOW, a music store begun by twin brothers Jason and Matthew Olim (with whom I happened to go to elementary school) in Fort Washington (which happened to be where I went to high school). In 1999 they were already public, and the cracks in the model (reliance on dropshipping and thus low margins) were apparent, but I applied for the job of managing editor of content — the reviews and features that supported sales. I didn’t get the job at first — the VP thought I wasn’t “technical” enough — but when he left months later, I was hired.

 

Just the year before, I had declined a spot in UC Berkeley’s journalism school for personal reasons, but I had climbed out of trade mags to be in the middle of it all — internet, content, music. It didn’t last — Amazon entered and soon dominated the CD market, and Napster introduced filesharing, which was the beginning of the end. By 2001 CDNOW had merged with BMG Direct — the competitor of the old Columbia House record club where I had first stocked my CD collection — and began shutting down and laying off workers in Fort Washington. I had just accepted a transfer to our New York staff the day the Twin Towers came down, killing my boss’s brother and shutting down CDNOW’s Wall Street news office for months.

 

Sometimes being in the middle of it all isn’t where you want to be.

 

2002

EDU Redux

CD later, alligator. In mid-2002 I went back to school, leaving CDNOW/BMG for a safer haven: the Wharton School Publications Office, as associate director. Once again I was rejected for the job, getting it on the rebound after asking why I wasn’t selected and how I could be reconsidered. (Part of the answer was taking such a serious pay cut that I did not reach my CDNOW salary again for 20 years.)

 

“Publications” as the name of our office was an anachronism. I worked primarily on digital products, with our video-rich MBA marketing package distributed via CD, since streaming was not yet reliable in international markets. The project was considered so innovative that we won the Gold CASE Award — the highest honor in college marketing (there are awards for everything).

 

By 2003 we gauged that tech infrastructure was advanced enough that we were able to ditch the CD and deliver virtually all admissions marketing online, using a custom CMS that became increasingly clunky, with only the merest flyers mailed out to direct prospective students to the web.

 

When I was sourcing photos for a story on a student event, an undergraduate told me to grab them from Facebook, a new social website for the Ivy League. I submitted my upenn.edu address and began my account in May 2004, weeks after the site first expanded beyond Harvard. I mostly forgot about the account until about 2008, when it was widely adopted by my peers. This earliest social media experience is relevant to Douglas Eyman’s citation of Gunther Kress’s 2003 observation that text is “a matter of social action and social forces, and all aspects of literacy are seen as deriving from these actions and forces” (Eyman, p. 23). Without a relevant audience and a social and cultural environment, a social media platform is a dead end.

2006

Go Kelly go

I launched my first website around 2002 to post pictures of my garden, using the URL www.gokellygo.com. I wasn’t a MySpace or LiveJournal or Blogger person, and Instagram didn’t exist yet.

 

In 2006 the advertising agency for Ford Motor Company approached me to buy my URL in support of an advertising campaign featuring the Kelly Clarkson single “Go.” This was the tail end of the URL parking and resale market, and I asked a Wharton coworker who had profited from it what to charge. He told me to consider what my name and content were worth to me — did I want to move them? $10,000, we decided. Ford declined.

 

From the early days of the Internet, visionaries and bandwagoners have made money. I have never been one of them.

2009

Moderated and disintermediated

After my first child was born, I revisited my print and storytelling ambitions by taking on a role at the Wharton Alumni Magazine, leading short- and long-form content for a quarterly print and digital publication. At the same time, I finally read Harry Potter, and as a mother and writer, I realized that the books of my childhood had inspired me more than literary fiction ever had. In isolation, I began to write a middle-grade novel. When my second child was born, I stepped away from full-time employment for unsteady freelance work.

 

I finished my novel, found an agent, and got my first iPhone so that I could stay connected through the agonizing silences of having a novel on submission. I had found online and real-life friends in gardening and parenting forums over the previous five years, but when I joined online writer boards and writer Twitter, the difference between IRL and online friends disappeared. Some of my writer friends became famous, and I became acquainted with some who already were.

 

My first novel never sold. My second did, to a small press that went out of business, then to another small press. I self-published a Kindle short story after being dropped from a traditional anthology, dipping a toe into the disintermediated world of online publishing.

 

Douglas Eyman describes the 2005 work of James Zappen, focusing not just on rhetorical strategies and new media but on the formation of digital identities and the potential for building social communities (Eyman, p. 29). What he did not describe was the entropy that takes place in online communities with the distance and anonymity of digital identities. On the moderated board, our community fell apart in bitter recriminations, as all communities seem to. On the unmoderated one, a public persona became an expected but fraught thing amid writer blowups and kerfuffles over privilege, hubris, and identity that predated so-called “cancel culture.”

 

At the same time, as I became immersed in writer communities, I began reading more than ever. Nicholas Carr cited Scott Karp, who writes about online media and confessed that reading on the web had destroyed his voracious book consumption by changing his thinking (Carr). I experienced the opposite, and part of it came via another social channel: Goodreads, a readers’ review site whose annual Reading Challenges gamified my bookshelves. I became determined to beat my annual target of 100 books, a target I am falling behind on this year for the first time in years (I’ll put some blame on this class!).

2014

Content is king, but I am not queen

While content is king, children’s books don’t pay. I returned to work at Wharton part-time, then full-time, at a far more junior level than the one I had left. The department had changed from Publications to Marketing Communications to Marketing Technology. My responsibilities — working as a marketing project manager, writing marketing copy, telling student stories — changed very little, but my title changed from writer/editor to content producer, manager, and eventually strategist.

 

As the school’s social media accounts turned visual (Instagram, YouTube), I followed reluctantly. I’d write a script or tell a visual story because it was required, but I’d personally rather read something than watch it.

2020

Winter of dis/content

Everyone remembers 2020; no one remembers 2020.

 

The COVID pandemic accelerated societal changes that had seemed improbable or distant — removing barriers to telework and telemedicine while widening the digital divide between households with broadband, devices, and safe home spaces and those without.

 

I had been struggling in a new job at a philanthropic research center at Penn, but the pandemic gave new direction and meaning to my work. While my younger colleagues felt the pain of social disruption more, I felt purpose and connection with my family. After struggling to get past the gatekeepers of children’s literature, despite selling four books (three of them without an agent), I paused. My children were teens. I left the digital spaces for writers and stopped posting to my accounts, amid culture clashes that pitted friend against friend against troll.

 

2023

No thank u next

Meaning doesn’t stay fixed; I have to keep constructing it. With 10 to 20 years of work ahead, I’m worried. My career has existed on the margins of technology, and the margins have moved beyond me. TikTok, algorithms, AI, and AR are beyond my knowledge and even my interest. 2005’s innovation, the podcast, continues to go strong, and streaming content has exploded, but I get impatient listening or watching when I could read much faster and understand much better. If I were beginning my career as a low-level writer and editor, ChatGPT might easily replace any trade-magazine copy I could produce. But I’m here closer to the end than the beginning, existing on the margins of marketing technology as digital strategy moves beyond me. I joined this course to help make sense of it and to reposition myself as a digital actor and not an artifact. No one is future-proof, but the future is coming.