Archive for the ‘Opinion’ Category

A Conversation about Apps

February 1, 2012

I wrote this, literally, last year, and never published it.  Sorry it’s so late.

It started with a tweet from a friend (Siri can access Best Buy’s product catalog thru Wolfram Alpha), which encouraged me quite a bit after reading this downer (“Siri Is Apple’s Broken Promise”).

As is my wont, I responded, “I see this happening more and more (call it “app hyper linking”); I’m doing it more and more. Kids today will think no more of –”  And also, an example (Siri and Watson, or Cyc [or, now that I think of it, Wikipedia!]).

But I’m not sure Mark got that I only used Kinect as one “fer instance.”

Or maybe he did, and linked to another thought.  Then more:

“Trick is to figure out a way to get synergy from hi-res display, simultaneous site/app access and standard api for data exchange”

“I think we understand data interactions pretty well. The next breakthrough requires increased density and scale of visualization…”

I didn’t realize the full import of what that meant till now; I replied in a different direction, since we were talking (initially) about Siri, and I still think there’s some future in voice UI.

Then Mark added more, different thoughts (for him, the most interesting UI of ‘the future’ is Kinect).

But I got stuck on the data thing.  Was I wrong?  I added a ref. to a story I never fully read, until now.  Here’s the conclusion from that article:

The field of astronomy is starting to generate more data than can be managed, served and processed by current techniques. This paper has outlined practices for developing next-generation tools and techniques for surviving this data tsunami, including rigorous evaluation of new technologies, partnerships between astronomers and computer scientists, and training of scientists in high-end software engineering skills.

But back to what we were saying… what were we saying?

I started on the subject of “app-linking.”  The best example I have, simple though it be, is a link in my (work) e-mail to a WebEx.  From the mail app, I click the link, which switches to Safari, which somehow recognizes this is not a normal web page and launches the WebEx app.  People have complained of the lack of integration in Siri (I expect more from Apple in the months and years to come).  I remember the first time I had a full multi-OS/multi-platform experience: an old Mac iBook, running Linux, running VMware, connected to a Windows virtual machine (it looked like Windows, felt like Linux, and had the shell of Apple).  I think this type of cross-ness will only increase, to the point where kids today won’t give a hoot (most of the time) about what “platform” they’re on.  Except maybe the specs of the display — and then only to figure out how to configure three “monitors” (including a wall-projector and/or a multi-panel LCD), and a wireless mic/headset to a back-room server of ginormous proportions [running Windows, Mac and Linux respectively] — and “apps” running in the monitor itself, communicating flawlessly with “traditional” applications, and data-manipulating back-end software I haven’t even thought of, yet.

Our other main tangent was UIs — I like voice, Mark favors Kinect.  I think the computer from Star Trek is not that far away; but gesture-based interaction is coming nearer, too.


What happened at my latest assignment

June 12, 2011

I’ll get back to tech. writing soon, but first my last thoughts about my ‘old’ job:

I’m writing separately about my experience with IBM in general, and here specifically about my last assignment.
Before I even showed up at my new assignment, I told them that I had already scheduled a week of vacation (first week in March) — plane tickets bought and everything, and I made no secret of that. I arrived to find that I did not have any access, either to the building itself or to the computer network; it took over two weeks to get everything installed and access granted (less time for the physical access, but being at my desk did not let me get any work done, only attend meetings). While I was on vacation, I got an e-mail that requirements were due for my new assignment that day — not only did I get the e-mail while I was known to be out of the office, the e-mail was sent on the due date, not before.

Well, I worked as I could, learning new technology, as well as a new environment — new people, new processes (or lack thereof), new red tape…

Here is the bragging section: I started with minimal knowledge of some of these, and by the end, I had written a new method for a web service, using a framework I was unfamiliar with, communicating with the database via Hibernate (another first, for me); I had also modified the client application to call the web service, and changed the UI, using JGoodies (new — though how much use I made of it is open for debate).  I worked on almost every layer of this n-tier Java application, using Java 1.5 on the client and only 1.4 on the server, running on WebSphere 6.1 (new), coding in RAD 7.0.1.  And the entire infrastructure was a mystery to me — I’ve never seen a diagram, yet.  All the while, I navigated a new social and political landscape — sometimes more successfully, sometimes less.

When it came to the Thursday before the due date, I had made clear that things weren’t all right, and finally let it be known clearly that the project wouldn’t be done that week, even to the point of taking the (personal) initiative to call the project manager’s boss on the phone and say, “this isn’t feasible.” So I was asked, “well, could you work the weekend.” (Note there is no question mark on that sentence — it wasn’t really a question.) Not knowing any of the players, and without any offered assistance or communication with the people who needed to approve, that would be difficult. Arrangements were made to make two people available, and I was led to believe a third, who had key knowledge, would be too — but when I called him, repeatedly, on Saturday, he told me in no uncertain terms that he couldn’t help. By five o’clock Saturday, I’d accomplished more than I thought possible, but still called the final contact and said, “here’s where things are, and here is where they will stay — I won’t work on Sunday, and we’ll finish next week.”

It still took until Tuesday, but it was done. Then some support was given while the project was promoted, and for the next month after that, I sat and did nothing, because there was no work to do… So explain to me again why it was so important to do that last job in a rush??

That’s the most I’m going to complain, except for this: last week, I’m told, there was an “all-hands” meeting (to which I wasn’t invited, and it wasn’t just me — there was at least one other person); one of the areas discussed was communication, yet the meeting wasn’t communicated to everyone. [You can put your own emoticon or exclamation here.]

Contrast that with starting my new job: I got two e-mail messages the week before letting me know where to go, who would have my new laptop, and my initial account set-up. There was still a snafu along the way — isn’t there always? But requests for access were granted in short order, and I was contributing and productive by the 3rd day.

Good people along the way

May 18, 2011

There have been many people who helped, who deserve special mention, and a couple of companies to reference, which I’d be happy to do business with again.  A surprise assist is due to Yan Tsirklin, and I owe him thanks — be sure to check out SocialWish, and, if you need “unique and exclusive written articles created to your specifications”, check out Text Broker.
I’d gladly work for Steve Mueller anytime, anywhere.  Also, Steve Musch is an excellent technical recruiter, whom I was pleased to work with, although that opportunity did not work out (through no fault of his).  Ethan Scheetz, at Recruiters of Minnesota, helped me, and almost landed me one job (I came in 2nd, due to myself, not Ethan!).

I want to thank my wife, for her patience, understanding, help and support, and un-ending love.

Intertech is another good company; I am happy to know people there, and maybe someday will work with them…  Steve Z. at TekSystems is a great guy that both my brother and I know, and whom I’d gladly recommend.  Another person known to both of us is Heidi Cline, recruiter at Concord USA — she was extremely kind and helpful.  I can’t say enough good about her.

Also, lest anyone think I only had “bad” experiences at IBM, there is one person I consider a mentor, who aided me both while there, and while leaving — thank you, Scott.

All in all, I consider myself blessed by the people I know, and who know me.

How I got here

May 18, 2011

In two weeks’ time, I will no longer be employed by IBM — I have been laid off, part of a “large” RIF (“reduction in force” — also a verb, as in “I’ve been RIF’d”).  Here’s what happened:

Some of this I’ve already written, but here is more detail.  I’m posting this now because I’ve made a decision.

The story really starts back when I joined IBM.  Before I was hired, a friend suggested it was not the best place to work — they had their own story of administrative nightmare.  Well, it took the company almost six weeks* to decide if they wanted to hire me or not.  Three “managers” at BlueCross BlueShield of Minnesota suggested, strongly, that I apply, and said they would put in a good word for me — for which I thank them greatly.  I interviewed, and was then told by a different person in HR that I wasn’t accepted, but no details were given, although I asked.  So I contacted the HR rep. that I initially spoke to, who seemed mystified that there was no info, but also quite certain that I should’ve known the new person was now the point of contact; however, he agreed to look into it and give me some idea of what wasn’t right — and, to his credit, he did call me back and say, “wait a minute, we’re reconsidering….”  So although I continued to search, I did wait, and eventually, quite late, found out that I would be hired by IBM.

* I am not sure of the length of time.  I do know another person who was in a similar, but worse, situation.

It was quite interesting to leave BCBS MN, take two weeks off, go to training in Washington, D.C., and return to the very same desk, now working for a different company, doing the same thing I was…

Fast forward to February this year — that first week, I’d had a conversation with my manager to discuss my next assignment, and by Wednesday he’d e-mailed the previous project manager to find out when I’d be finished.  The response on Thursday was “Greg was done working on this project as of Tuesday.”  Consequently, I spent the next week both finishing up and sending documentation and notices to people, and also trying to secure a new assignment; contact for the next client, however, was initiated by an IBM manager there, and I reported to my new work site the next week.

Then, at the end of the month, after only two weeks at the new assignment, I was told that I was being laid off.  And I was given a month’s notice, the week before I was on PTO for a week.

This post is not intended as sour grapes!  I’m not overly complaining about my state, just reporting the interesting and sometimes humorous developments along the way.

What else happened at that client is another story… [this link doesn’t work, yet]

I returned from vacation on a Friday, updated my status on LinkedIn, and sent a couple of e-mails.  By Tuesday the next week, I had three opportunities — one didn’t pan out because the process was too far along (another candidate was offered the position before the end of the week), one because the HR person did not want to deal with a (potential) “do not compete” clause (there was none to speak of, but that is his loss), and the third eventually led to an offer, which I’ll accept and start presently.

That started at the end of February, and I was told my last day would be March 28th.  I asked if I could appeal, and was told, “yes, but it probably won’t work.”  Others told me similarly, that IBM was approving fewer and fewer appeals.  But in spite of many people saying so, I did get a two-month extension.  Now, my last day is May 31st.
The point is, just like when I started, as well as at the end, communication within a huge company is rife with mistakes.  So much so that, for three months, administratively, one branch of IBM did not know where I was, while I was (successfully, appropriately) billing my time to my latest client — the resource/assignment e-mails kept suggesting I look for a new assignment, because I was, or would be, finished with the current one very soon (!!).

In all this, there are some people I want to recognize, and thank.

Functional Programming in the News

July 27, 2010

Once again, the news leads my way.

First, it seems that chip designers had no choice but to create multi-core processors — even though it was a risk, even with all the unknowns.

And now Apple is announcing a twelve-core machine.

And second, the best way to deal with this situation is not threads, but functional programming — not mentioned are OCaml, Haskell and others, but Clojure and Erlang get the nod of approval from Tim Bray.
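Tim Bray’s point can be made concrete with a toy sketch.  This is my own illustration, not from his article, with Python standing in for Clojure or Erlang: a pure function over immutable inputs touches no shared mutable state, so spreading it across cores needs no threads or locks in your own code.

```python
from multiprocessing import Pool

def word_count(line):
    # A pure function: the result depends only on the argument.
    # No shared state means no locks, so it parallelizes trivially.
    return len(line.split())

lines = ["the quick brown fox", "jumped over", "the lazy dog"]

# Sequential version.
sequential = list(map(word_count, lines))

if __name__ == "__main__":
    # Parallel version: because word_count is pure, going from map()
    # to Pool.map() is a one-line change -- extra cores become free
    # speedup instead of a locking hazard.
    with Pool() as pool:
        assert pool.map(word_count, lines) == sequential
```

In the functional languages this is the default style; in imperative code it has to be discipline.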

So I’m still on the right track, or others are now catching up.

I’d like to hear other people’s experience with functional programming.  How did you come to it, and what are the results?

Respectfully Disagreeing

March 18, 2010

A blog which I read sometimes, and agree with somewhat, and before today would have recommended “without qualification,” now comes with a caveat.  All because of a quote in a guest post (yes, I know the disclaimer is usually, “the views expressed … are not necessarily [my own]” … so consider this my reply to that author, not the primary blogger; still, it needs to be said):

“Nothing in biology makes sense except in the light of evolution.” -Theodosius Dobzhansky

For reference, Mr. Dobzhansky was born over one hundred years ago.  Today, I find that statement almost humorous, given what we know about “junk DNA” (which is anything but junk), to name just one area.  See, for example, this article, which shows the silliness of comparing our DNA with chimps’, or with other species’ (read the first two paragraphs, then just jump down to the 3rd paragraph of the Paradoxes section):

… More recently researchers have turned up a pea aphid with 34,600 genes and a water flea with 39,000 genes. If genes account for our complexity and make us what we are — well, not even the “chimps are human” advocates were ready to set themselves on the same scale with a water flea.

There was some research a while back (several years ago, if I recall) which basically found the root of some disease to be located precisely in the non-coding DNA (the correct term, now, for what was previously called “junk DNA”).  This link was missed primarily because of the bias in genetics toward evolutionary theory and against looking at that odd stuff — some 98 percent of our DNA is non-coding, and the ratio is higher for “higher life forms” than for lower organisms.  Unfortunately, I cannot find a link to that, but I did find an excellent article by William Dembski, now ten years old (the article, not Mr. Dembski), defending Intelligent Design and, more specifically, his own work and the foundations of it, as well as some consequences of it.

I was led to this:
Amato, I. 1992. Deoxyribonucleic acid: the chemical inside the nucleus of a cell that carries the genetic instructions for making living organisms. DNA shows unexplained patterns writ large. Science 257: 747.

All that to say: I couldn’t disagree more with the statement quoted (and some of the logical inferences from it in the blog post).  At the same time, I will maintain that we human beings are biological, and respond in a purely biological (read: materialistic, animal) fashion at times; however, I deny that the root of the similarity is evolutionary, and the similarities cannot be purely reductionist.  We are, ultimately, uniquely, human, and that is more than physical.

To quote: “I cannot help but believe that [being human] means that … we are responsible, and we are free; that we are responsible to be free.” (From the intro to a Rich Mullins song.)

Not Just a Software Engineer

October 27, 2009

While my official job title is Senior Software Engineer, many times I’m also:

  • detective, historian,
  • logician,
  • aesthetic judge and grammarian,
  • and salesman and/or marketeer

There are other roles I play, as well.  Sometimes as maintainer of existing software, less often as initial designer, but still…

Although it is not, in the general sense, productive to figure out “whodunnit,” it is sometimes necessary and useful to find out who originally wrote a piece of software, or who was the last person to change it, in order to answer why it was done this way — there may be a reason, which is important to preserve, rather than simply undo the effect.

Frequently, at least in my work, the answer to the preceding involves digging through history, recreating the past — fortunately, modern software development entails using a version control system, which tracks who changes what, when.  Just recently it was the case, not uncommon, that the problem was not caused by any particular change, but by the cumulative effect of several changes — and those are not always related!

In some situations, more often than not long after deployment to a production environment, the process of debugging involves a strange mixture of “reading tea-leaves” and guessing, based on extrinsic information (by “reading tea-leaves” I mean guessing about the current state of affairs by viewing only partial, random, sometimes seemingly meaningless bits of information).  Many times it becomes necessary to deduce the scenario causing the problem based on contextual information and situational data, and interpolating the flow of a program with incomplete knowledge — then it is almost like solving a sudoku puzzle: “the actions taken in the intervening time (where we have no direct knowledge) must have been thus because the information at the beginning and the results indicate that this is the only possibility.”  (“The number in this square must be six, because that is the only remaining value for this row/column and square of nine.”)

Let me add here two personal notes: first, none of this is a shocking revelation — not to those who do this work regularly; secondly, this is sometimes the interesting and exciting part of my work — sometimes, decidedly not, but at times, historian/logician is an exciting occupation of my time!

There are times — more frequently during initial development of new software — when the right choice among various competing alternatives is determined not by logic alone, but by what “looks pretty” — at least in some sense, what is aesthetically pleasing — or what will, in the long run, prove “easier.”  The reasoning may run like this: “yeah, you could do it that way, but no one looking at this later will find it intelligible.”  Sometimes, like grammar rules, the justification for a choice is, “that way is the one which flows easily and nicely, and according to tradition (not always a bad thing).”  In the past, more than today, I rebelled against this idea: “why did I spend years learning how to do this [complex technique, or obscure solution] only to continue coding at a ‘high-school level’?”  But age and experience have softened this: someone else will come after me and, if I make things difficult, curse my work.

Finally, salesman and/or marketeer: not infrequently, I need to sell my idea to others — persuasive skills are usually required, whether with a fellow developer, a business user, a client or another co-worker.  More often than not, I find myself not just a presenter of options, but also an advocate.

But here’s my point in all this: they don’t teach you this in school!  Computer Science education almost never covers the debugging skills you need almost immediately when you enter the work-a-day world.  My high school and college did little to prepare me for a career of convincing other people to come around to my point of view — oh, I did get that education, but not formally in a classroom (save one experience in a speech class, and that not done very well!).

No one told me, when I was learning any programming language, that there is more to it than actually designing and coding — you have to pitch your ideas to others, work with people, and work after people; we never did anything even close to that in any class I can name — with people, once, my senior year, but after other programmers, doing maintenance and debugging?  Not at all!

So I’d like to hear your experiences, especially other computer programmers, but any engineers, or, for that matter, anyone in any field who happens to read this.  Please give me some feedback; I may incorporate your comments here, and expand on what I’ve said.

Responding to “Sams Freedom …,” responding to Valarie Stevens

March 27, 2009

So someone I follow on twitter put up a lengthy blog post of recommendations for Twitter and blogging and got an interesting comment to the effect “you’ll never make money putting good info like this for free on the Internet.”  (Of course, I’m paraphrasing and, of course, I’m biased!  That’s the reason for this post.)

And I couldn’t fit my reply in 140 characters (hence this blog!).

But I couldn’t disagree more.  Sure, you may be able to make money by selling this information, but, as good as it is, I wouldn’t pay more than $1 for it, and most likely not even that.  While I personally wouldn’t buy it, a lot of people might.  Some may even think they got a bargain.  But that wouldn’t add to your circle, and it would limit, not increase, your influence.

Bear in mind I come from the software world, where open source is my friend: software that does so much, including operating systems, for free.  Of course you should give back, and of course there is more to the story, but the point is this:

We’re talking about the Internet here, where Wikipedia makes the knowledge of the whole world available for free.  And if Web 2.0 is about anything, it’s a giving and sharing community.  There are ways to profit legitimately (selling goods and services, etc.), and then there’s the make-a-buck-quick anyway you can people.

Here’s my bottom line.  I think the blog post in question is completely apropos, and maybe it could be sold, but then a lot of people who won’t pay would miss out.  As it stands, Valarie has my good will and is more likely to get a recommendation from me, while a salesperson would have neither….

Expect this to be updated in response to comments I get (or don’t).

If You Wonder about Jay Leno

February 17, 2009

Because you may be curious (I was), here’s the news about Jay Leno.

The NYT reported in December that Conan will take over in May (not what I’m seeing in current ads [read below –ed.]), and Jay will move to a ten o’clock show (Eastern Time) to compete with CSI (Miami and New York).  There is more rumor than information in this story.

The Seattle Times reported more news on Sunday, but with contradictory information from the network and from Jay himself: a longer monologue, or not; a different format; and a tough adjustment, ratings-wise.  I half agree with the final quote:

“The key is never become a personality, always try to have material, always try to have something funny,” Leno said. “Never assume anybody wants to see you. They want to see you do whatever it is you do, as opposed to just showing up.”

Here is a source for a brief, readable interview with Conan: Women on the Web.  (This makes him sound the most likeable I’ve ever heard — not that I plan on becoming a fan!  Decidedly not.)

Finally, this story again says that Leno will be hosting the Tonight Show until May.

The whole story: I see here (Conan will be in Charlotte soon, if not recently) that Conan will sign off of Late Night this week, but not start the Tonight Show until June.  Okay, I understand now.

The case against “The case against Web Apps”

January 30, 2009

A friend put this link up on plurk yesterday, and I just had to disagree.

I think Web apps are reasonable and responsible development, and I wanted to respond point-by-point.

1. It’s client-server all over again

Well, yeah, if you do all the work on the server only.  But why not distribute the work?  Put the common work in a common location.  Yeah, your laptop may have loads of power, but the server has more, and access to more data, which you really don’t want to distribute everywhere (or replicate umpteen times), do you?  Update: By the way, today (3 Feb. ’09) I read a Gartner report from 4 Aug. 2008 which says that Server-Based Computing is cheaper than PCs by 12 to 27%, and has other advantages (according to an IDG Connect e-mail I get regularly) — yes, it’s by Citrix, so there are ulterior motives, but Gartner nonetheless.

The other answer to this complaint is: yeah, so?  It’s not like the client-server model has been completely ‘disproved.’

I should point out that (a) I’m a senior software engineer, and (b) I don’t have a compiler installed on my laptop!  I don’t need one.  All my work is done via web browser.  Seriously, my development tools are MS Word, Visio and IE — and occasionally Excel.  So I think the case against Web apps is way off.  Of course, mine is only one perspective, but I think the generalization is valid, because my work spans two industries and different companies (just check who is using PegaSystems’ PRPC [Pega Rules Process Commander]).

2. Web UIs are a mess

Yeah, they can be.  What UI isn’t?  Or, is there a UI that can’t be a mess, or can’t be cleaned up?

3. Browser technologies are too limiting

This may be true, but I think Google would disagree.  Besides, are we really limited to the browser?  Maybe we are now, but not for long if people continue to develop software there, and all forms of UI change over time (no paradigm lasts forever).

4. The big vendors call the shots

If they follow some standard, I don’t care.  This may be my weakest argument, but what platform doesn’t that apply to?  Tell me that Microsoft doesn’t control the Windows UI, or Apple the Mac.  Have you seen the outcomes for applications that break with existing norms in any arena?  Not pretty.

5. Should every employee have a browser?

You’re kidding, right?  What century do you live in?