Tech Career Nostalgia
I admit I am a pack rat at heart. Perhaps it’s because I was exposed to prairie grandparents who lived through the Great Depression when I was at an impressionable age. Or maybe it’s just a character flaw. Nevertheless, at the end of my 25-year technology career late in 2002, when “my services were no longer required”, I walked away with copies of a variety of software and data I had produced. On at least two occasions over the first few years after that, I got e-mail or a phone call from some of my former bosses who asked, “Is there any chance you have a copy of that wonderfully useful test software you wrote, because we desperately need it and the acquiring company discarded their copies.” I treat that as “thank god you stole it”, so I have no concerns about these particular intellectual property rights. (The editors may disagree.)
Within the last few months, after reading our own fillyjonk’s complaints about doing online classes, as well as seeing similar complaints elsewhere and suffering through Grandparents Day visiting my granddaughter’s classroom by Zoom, I went hunting through my old backup media. I was looking for the stuff I had done in the period from 1993-1995 that provided multimedia capabilities I thought then would be useful for seminars or small conferences conducted over the emerging commercial internet. Working off and on for the last couple of weeks, I got players for my three principal media working [1]. I am largely disappointed in how little progress seems to have been made in 25+ years. That should be like a century and a half in internet time.
Consider the state of the world near the end of 1993. The abomination called Video for Windows was available but regularly crashed any machine it was running on. The first-generation Intel Pentium chips were available, but you were much more likely to have a ’486 box running at 50-66 MHz. With Linux running on that sort of vanilla hardware I had roughly AM-radio-quality audio and the world’s ugliest video [2]. It was literally black and white video – black dots and white dots and nothing in between. But it moved! (Like the Sundance Kid, it’s more impressive when it moves.) Fifteen frames per second, so you could read body language and tell the audio and video were synchronized. And I could shove it around the corporate data network, from my office in Boulder to offices in Phoenix or Seattle or Minneapolis [3].
Early in 1994 I added my version of an overhead transparency projector medium to the kit – vugraphs, as I called them. Again, the basic image was just black and white dots. But with the ability to scribble in red on top of that, and a movable shared pointer (the red square), lots of things were possible. For still images it was obvious I could do better than just black and white, but I was more interested in the “sharing” aspects at that point. At one point at least six people could share the vugraph medium, putting up base images, scribbling with red lines (and boxes and circles and limited text), and fighting over the pointer. I know it worked with six people because a colleague came and got me one day so I could look over his shoulder while six people – three in Boulder and three in Minneapolis – used two instances, one for source code and one for results, while they argued about who had screwed up.
By sometime later in 1994 the powers that be insisted I add grayscale video (again, it looks better when it moves). I admit it’s “prettier” in some sense than the black and white dots, but it’s unclear whether it was more useful. A couple of years after that, another research group in the company conducted some trials with some universities in Oregon using very expensive workstations and very expensive fiber optic links. The little video windows in those trials, based on someone else’s opinion of what was important, were full color and 24 frames per second. One of the bottom lines in the final report from those trials was, “Given good still images and good audio, video is quickly relegated to body language signaling.” Body language is important [4].
A few years later when color frame grabbers had become more-or-less affordable and processors were faster, some of my prototype applications were still in use for demonstrations and the powers demanded color video. I did a quick and dirty extension to the grayscale but my heart wasn’t in it. I’d lost the war. IP multicast was/is a key element to doing shared media well and the company wasn’t going to implement it. (Nor do the commercial IP providers seem likely to support it in my lifetime.) Services were going to be dominated by software but the company wasn’t interested in being a software company. We could have given away quite reasonable multimedia meeting software as part of the basic service arrangement pre-2000.
So. Anyone else want to share career nostalgia?
1. Reconstructing this may be one of the dumber things I’ve ever done. It certainly has the potential to be a major time sink if I put together the ability to record things.
2. There wasn’t a chance in hell of doing video other than the very limited VfW media on a Windows box. The necessary preemptive multitasking was still a couple of years away in Windows 95. They did unspeakable things to the rest of the operating system to get their very specific model of video to work at all in 1993.
3. Black and white dots met several requirements. At that time, the only display mode you could assume would be available on all the corporation’s desktop computers was black-and-something dots. The computational effort of the decoding algorithm I developed was very low, so a ’486 box could show three or four independently moving little video windows. And some expert had told me I couldn’t implement error-diffusion dithering that only encoded partial frames – nothing I liked better at the time than proving some expert wrong. Black and white dots are limiting, but there are a lot of cases where color or even grayscale is not absolutely necessary. She’s still pretty rendered in black and white dots.
4. As almost everyone notices, the standard arrangement with a camera in the laptop or phone bezel or clipped to the top of a monitor, providing a nose-centric view of the person, almost completely precludes body language signaling. I claim this is a really serious flaw, and nothing is going to fix it except cameras that catch the person from at least the rib cage up (plus relatively free arm waving). Michael Siegel’s Video Throughputs series gets it (mostly), but the one-bit 15 frames per second video transmitted body language quite well.
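Footnote 3 mentions error-diffusion dithering. For the curious, here is a minimal sketch (in Python, which the original codec certainly was not) of the classic Floyd-Steinberg scheme for turning grayscale into one-bit dots. The function name and the threshold at 128 are my choices, and the partial-frame encoding trick is omitted.

```python
def dither_1bit(gray):
    """Return a raster of 0/1 pixels via Floyd-Steinberg error diffusion.

    `gray` is a list of rows of 0-255 grayscale values.
    """
    h, w = len(gray), len(gray[0])
    # Work on a float copy so diffused error can accumulate.
    buf = [[float(v) for v in row] for row in gray]
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            old = buf[y][x]
            new = 255.0 if old >= 128 else 0.0
            out[y][x] = 1 if new else 0
            err = old - new
            # Push the quantization error onto unprocessed neighbors
            # with the standard 7/16, 3/16, 5/16, 1/16 weights.
            if x + 1 < w:
                buf[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    buf[y + 1][x - 1] += err * 3 / 16
                buf[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    buf[y + 1][x + 1] += err * 1 / 16
    return out
```

Because each pixel costs only a compare and a few adds, a decoder built this way is cheap enough for the ’486-class hardware described above.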
I’m always amused by the advocates for tools who claim they’re superior simply because they have already invested the time to fully learn the tool.
Like vi or emacs. The only people I know who love vi are the people who know it inside and out, and thus are very efficient with it. I admit it has its uses, but I’ll take a GUI text editor any day. Well, except for the GUI version of emacs, which is just as unwieldy as the original unless you already know emacs well.
That would be me using vi. After 40 years of regular use, my fingers just know it. If you asked me what the keystrokes are to do x-y-z, I probably can’t tell you. But my fingers know.
But that’s just for monospaced flat text (personal code, and to some extent first drafts of things done in HTML). If I’m writing, as in fiction or academic non-fiction, I use a GUI word processor that includes at least basic formatting and spell checking (not correction, just checking). And, very strangely, except for tables: I find it easier to use groff and tbl to produce an attractive table, then screen-grab it at high resolution and paste it in as a picture. I can make a table in 15 minutes that way that takes me an hour of futzing with a word processor or spreadsheet to get reasonable formatting.
One of my career personal highlights was being in an old disused lab and the CSE had to get on an old disused system to check for an old disused program for the answer to a problem that he was working on upstairs. He needed to edit a file and the system didn’t have nano and he, the genius who dang near built all of this stuff from the ground up, had me edit the file for him.
I got e-mail or a phone call from some of my former bosses who asked, “Is there any chance you have a copy of that wonderfully useful test software you wrote, because we desperately need it and the acquiring company discarded their copies.”
Ah, consulting.
I offered a quite reasonable hourly consulting rate on three different occasions (the two I mentioned and a call from legal who wanted help defending a patent application with my name on it). No takers.
Those bastards. They’ve got no problem dumping goodness knows how many dollars on outside consultants, but they blanch when the money goes to their own guys.
It may have been more complicated. It was a period of massive merger and acquisition activity in the industry. When I was let go, I retired with a pretty handsome package of benefits. That package had accrued both from being a long-timer and in the form of retention incentives. One of the conditions the company imposed after the last acquisition was that people who had been released who hired back on would have to surrender/return many of those retirement goodies. It may have applied to consulting gigs as well, and the people I talked to were all bright enough to know I wasn’t going to hand back a couple hundred thousand dollars.
I can see that. In *THAT* case, I would have played the whole “When’s the last time somebody took you out to Casa Bonita and picked up the whole tab? The appetizers are out of this world! You know, if you happened to have an extra copy of such-and-such… we’d only use it in the one lab…” card.
Emacs is pretty great if you’re already a Lisp programmer, although Elisp is a bit of a franken-lisp. But still, a Lisp is a Lisp.
I have no idea why anyone else uses it.
First Java class I took, the TA wanted everything written using EMACS. I managed to figure out how to use it, finished the class, and promptly forgot everything I had learned about EMACS because Sun Forte existed.
When was this and what was Sun calling Sun Forte? I ask because I worked at Forte when Sun bought us, and for a while they were calling lots of stuff “Forte”.
I think it was originally called Forte for Java, and it was my IDE of choice back in the day (the day being 1996-ish). I don’t remember when Sun got involved.
Forte got involved with Java very late in our corporate life, and never produced a Java IDE of any quality. Forte for Java was Sun’s IDE, rebranded because they were initially under the odd impression that “Forte” had some cachet. That part didn’t last.
IIRC, Forte became NetBeans, or was supplanted by NetBeans; I was never really sure. I just remember starting out on Forte (we had a whole lab of Sun machines) and one other Java IDE (whose name escapes me) that the CS school had on the lab machines.
NetBeans was another company Sun bought about the same time they bought us; it was founded by a bunch of graduate students from Prague, which was a hotbed of software development around the year 2000. That sounds right, that Forte for Java was a rebranded NetBeans, and it shortly got un-rebranded.
I was visiting NetBeans in Prague during the 2000 election, and spent a lot of time discussing the situation and reassuring them that it would be resolved in the courts, not by violence. Remember back when we were sure about things like that?
So that’s what happened. I remember Forte being a brief flicker before NetBeans. I was still wrapping up my undergrad in 2000 and doing mostly IT work. Any software I wrote was little stuff for class or IT tools for work. It wasn’t until 2002 that I started writing code in earnest, and by then, Forte was gone and NetBeans was the thing.
Right tool for the right environment, I’d say. The few times that I had to do Windows GUI programming with buttons and windows and drop-down lists and so on, Visual Studio was great.
But most of my work over the years was developing backoffice processes running on Linux. I had a Windows laptop/desktop, but my C++ had to compile on the Linux build server. Working in a Windows IDE (JetBrains’ CLion, or whatever) was annoying; for any compilation fix you had to scp or rsync your code over – some devs even had the horrible workflow of using git as the sync, so you actually had to push to origin and pull on the remote for every missing semicolon. In comparison, logging in to the terminal and using vi or emacs was just much, much simpler.
You just want to check something in your source? grep it from the command line! Or less it! No need to wait 5 minutes for JetBrains to start up and download the latest versions of its plugins only to show you a simple text file. Since the compiled code wasn’t available, until recently the IDEs couldn’t even do the syntax checking and code completion that are their most useful features.
So yeah, I’m quite fluent in both emacs and vi (I’ve been emailing my .emacs file to myself on each employment change, which has not been all that frequent) and the finger combinations just come naturally, no brain overhead at all. Then you switch jobs and you get issued a macbook and everything just goes seriously pear-shaped.
As for tech I miss? New job has everything running in the cloud. I feel miles away from my work, I can’t touch anything; it’s all so… cloudy?
I feel old.
All of my stuff, both Java & C++, has to run under Windows and Linux. Usually I develop under Linux until I’m happy with it, push to the Git repo, pull to Windows, then build and run my tests under Windows. It’s a bit of a pain, but it’s part of the requirements.
It really is a shame we don’t have global multicast over the internet. Given how much today is “multimedia” (and I haven’t actually heard that term in ages), we do ourselves no favors making everything point-to-point.
Except it isn’t really point-to-point. It’s provider to CDN to CDN network embedded in ISPs to the last mile. The system works, but it creates a bunch of semi-monopolies for the CDNs.
(Full disclosure: my last job, about a decade past, was with a major player in the CDN market. My current job is for a {bigtech} who can afford their own in-house CDN.)
I guess it would have required a lot of logic in the backbone routers, as not every stream will be popular everywhere, so not every router will need to receive the stream, so deciding how to route a multicast stream would require some kind of “global stream id” that the routers could negotiate. That is a fair bit of overhead. In other words, this is a different kind of problem than “multicast IP” running on layer 2.
Still, I wish we had it. It would be better.
You would think with the advances we’ve made with learning AI, the overhead wouldn’t be that much.
…so deciding how to route a multicast stream would require some kind of “global stream id” that the routers could negotiate.
There’s a reason the IETF set aside one-sixteenth of the entire IPv4 address space for multicast. Experience back in the Mbone days suggested that routers wouldn’t be too heavily loaded as only a few in the backbone networks need to keep track of very many multicast trees, and the tree-pruning algorithms are good.
I understand that many CDNs are heavy multicast users.
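For anyone in this thread who has never touched it, the receiving side of IPv4 multicast is small enough to sketch with the standard socket API. This is generic code, not anything from the Mbone or CDN systems discussed above; the group address and port are arbitrary picks from the administratively scoped range.

```python
import socket
import struct

# Hypothetical group/port; any group in 224.0.0.0/4 works, and
# 239.0.0.0/8 is the administratively scoped (private-use) slice.
GROUP = "239.1.2.3"
PORT = 5004

def make_receiver(group=GROUP, port=PORT):
    """Return a UDP socket subscribed to an IPv4 multicast group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    # IP_ADD_MEMBERSHIP asks the kernel (and, via IGMP, the nearest
    # router) to start delivering this group's traffic to this host.
    mreq = struct.pack("4s4s", socket.inet_aton(group),
                       socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock

# Sanity check on the "one-sixteenth" figure above: 224.0.0.0/4 holds
# 2**28 of the 2**32 IPv4 addresses.
assert 2**28 * 16 == 2**32
```

Sending needs no membership at all: any UDP socket can sendto() the group address, which is part of what made the model attractive for one-to-many media.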
I always seem to miss out on so-called “golden ages” in employment. My first post-college job was teaching English in Japan. It was relatively easy and fun, but it was well after the end of the bubble economy and there was always discussion about how much money people made doing the same job in the 1980s and maybe into the 1990s. A few years later, the company I worked for went bankrupt because of a consumer fraud class action.* There were expats who suddenly were not getting paid and were being evicted from their apartments for non-payment of rent.** I suppose it is good luck that I missed that.
And then I graduated law school in 2011. I am doing okay but it took a while.
*English conversation schools in Japan were basically a joke. Most of the “teachers” were young adults in their 20s, and we received a total of 3.5 days of pedagogical instruction before being sent out into the wild. Also, the textbooks had not been updated since the 1980s. Eventually a bunch of students sued to get their money back and the Japanese courts sided with them.
**The company I worked for arranged your housing and just deducted the rent from your paycheck. When the teachers stopped getting paid, so did the landlords, and the landlords took it out on the teachers.
I don’t know if I ever worked in a “golden age” of employment — there were regular recessions and such, and my first gig at Bell Labs was just before the AT&T breakup. I did work for giant companies who weren’t (usually) looking at just the next quarter’s numbers, and whose top management realized they were technology-dependent even if not major technology creators, so tried not to whack the core forward-looking staff. I didn’t ever produce anything of large importance to the world, but I got to play in some interesting spaces.
The closest I ever came to any sort of world-changing opportunity was when I was finishing my MS and my housemate and I had gone to California for vacation for a couple of weeks. We stayed at his mother’s house in San Jose and ran around, often with a couple of women he had known as an undergraduate. At dinner one evening one said, “Oh, I told Dad about you and he says his company will offer you 20% more than your best offer.” Turns out he was head of the HR department at Intel and they were looking for people who knew network and combinatorial optimization.
All right. Footnote 4 has me wondering if I need to change my webcam configuration, because I do a whole lot of web conferencing these days. Will experiment with different places to put the camera.
The issue is I want my eyes to look like they’re looking directly at my viewer, to simulate eye contact. And I can’t position my camera in the middle of a monitor screen.
Yes, eye contact. I’ll bet you make fine eye contact with an individual juror from a distance of at least several, perhaps many feet. What you almost never do with that juror is crowd them to the point that you’re inside their personal space, nose-to-nose. You don’t get so close that you can’t gesture. The standard Zoom arrangement, which is not what they show in publicity stills, forces that kind of intrusion. I so wanted to go to my granddaughter’s class and put a line of tape on the floor and say, “Here is where you stop when you’re talking to Grandpa on camera. So I can see you wave your arms and be excited!”
Actors make eye contact with the camera without that nose-to-nose intimacy except when it’s appropriate. It’s almost never appropriate in a professional setting. I was learning it the hard way when I got to work with an interesting guy, now a CS professor at the U of Hawaii and then a former child actor (whose big credit was “I made Marsha Brady cry”). He set me straight on a whole bunch of things. The particular frames where I froze video are not necessarily the best ones for showing it, but I assure you I was playing to a camera several feet away, making eye contact with the audience.
A comment that few will see… I took granddaughter #1 bicycling this afternoon (Nov 7, Front Range Colorado, 70s and sunny). Afterwards I was chatting with my daughter. I mentioned that I was fooling about with this, although I didn’t mention it by name. She helped me set up a demo once, back in the days when Take Your Daughter to Work Day was still just daughters. “Oh, is MikeVision making a comeback?” she asked. “I was just thinking about it last week after a really sucky Zoom thing.”