Friday, April 8, 2016

Nailing jelly to the wall: managing change in the 21st Century

How do you manage change? No matter what you do, change happens to all of us – probably more than we think.

There are the kinds of personal changes everyone experiences as kids and parents get older, our own hair gets a little greyer (and sometimes thinner), and people, animals and places come and go from our day-to-day lives. And then there are day-to-day external changes that we typically have no control over – such as traffic and weather.

I would hazard a guess that most of us manage these kinds of changes pretty well, as they are well understood. Even when these changes are unpleasant – such as when there's a big hailstorm or a traffic jam that makes you two hours late – you can use your life experience to handle them.

And, as people, we all have resources that we employ when we’re experiencing difficult change. Friends, family, psychologists, doctors and people of faith are all typical go-to sources of support, ideas and comfort during those times.

So hold that thought – that vision of how people manage change – and now start thinking about companies and how they deal with change. I suspect there's something to be learned from one about the other.

MANAGING THE DEATH OF A PRODUCT

Bringing a new product to market is a tough business that involves a lot of blood, sweat and tears. But pretty much every product - no matter how smart, cool and amazing it is on the day it’s launched – has a lifespan.

For some products, that lifespan can be measured in decades. Take the flagship product of a company such as Coca-Cola, which – despite many changes in marketing and formulation – still sells vast quantities of sugared water every year.

For technology companies, however, the life of a product can be measured in months. Some products barely make it off the drawing board and into manufacturing before the company that sells them pulls the plug. This is particularly true of highly competitive, mass-market technology products such as smartphones – where competitive advantage can disappear quickly and manufacturers track daily data on the success or failure of their products.

Photo courtesy of Dell Inc.
 
I remember some years ago talking to Michael Dell, the CEO and founder of computer maker Dell. In a piece I originally wrote for The Financial Times, he talked about how his manufacturing model helped him gain insight more quickly about what products were succeeding or failing – but it did not make him immune to failure. Here’s an excerpt:

Mr Dell also said that the direct manufacturing model gave his company an unparalleled insight into the kinds of product areas that his company should move into – as well as those from which it should retreat.

‘One of the magic abilities of any great product company is to understand technologies and customer requirements and come up with the perfect combination to solve a problem’, said Mr Dell, although he admits that customers sometimes do not realise that they have a problem that technology can solve. ‘The customer is not likely to come and say they need a new metallic compound used in the construction of their notebook computer, but they may tell us they need a computer that is really light and rugged. Where some companies fall down is that they get enamoured with the idea of inventing things – and sometimes what they invent is not what people need.’
He said that by using the Dell model, he knows very quickly when a product isn’t going to sell. ‘When we launch a new product, we know within 48 hours whether or not it’s going to work’, he says. ‘We had a Web PC that didn’t turn out so well. But if you have no experiments, then you have no success. Occasionally, you just miss.’

PUTTING CUSTOMERS AT THE CORE

On another occasion, Michael Dell talked about his approach to managing companies in tough times. To set the scene, these comments were made in a speech in 2003, when the economy was still very much recovering from the economic downturn of 2000 to 2002 (and the aftermath of 9/11).

Here’s some of what I wrote about it in IT World Canada (the full article is here: http://www.itworldcanada.com/article/simple-management-advice-for-tough-times/19971#ixzz44bP5nMOz )

“Dell said that while he recognized that we were in tough economic times, the fact that times are tough didn’t really change what he had to do: listen to customers and deliver products that offer great value. It sounds like a hokey and simplistic solution to a complex situation. But it seems to be working for Dell ...
Dell is making good money and growing its business at a time that all is supposed to be doom and gloom. In his speech, Dell talked about having built his original business on the simple proposition that both consumers and businesses shouldn’t have to pay thousands of dollars each for computers that really consisted of about $600 worth of parts per system.

He figured, correctly as it turned out, that if you cut out the overhead of a delivery channel and a number of other costs along the way, you could still make a decent margin and deliver systems that people would be happy with. Dell also later figured out that if you did this directly via the Web and supported it with fast delivery times and good service, you would have much closer links with your customers and could count on repeat business.”
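As a quick aside, the logic of that direct model is easy to put into numbers. Here's a rough back-of-the-envelope sketch in Python – the $600 parts figure comes from the excerpt above, while the channel and direct prices are invented purely for illustration:

    # Rough back-of-the-envelope comparison of channel vs. direct selling.
    # Only the $600 parts figure comes from the excerpt above; the prices
    # below are hypothetical, chosen just to illustrate the argument.
    parts_cost = 600        # approximate cost of the components in one system
    channel_price = 2400    # hypothetical price through a traditional reseller channel
    direct_price = 1500     # hypothetical price when the maker sells directly

    channel_overhead = channel_price - parts_cost   # spread across distributors and resellers
    direct_margin = direct_price - parts_cost       # kept by the manufacturer
    buyer_saving = channel_price - direct_price

    print(f"Channel model: ${channel_overhead} of overhead and margin sits between parts and buyer")
    print(f"Direct model: maker keeps ${direct_margin} and the buyer saves ${buyer_saving}")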

CHANGE IS CONSTANT

So, as with people in their own personal lives, the one constant in business is change. Things WILL change – to a greater or lesser degree – in most businesses from one year to the next.

One of the best things you can do to prepare your company to deal with such changes is to be really, really clear about what your assumptions are – and then have a laser focus on continually re-validating those assumptions. If any of your key assumptions prove false (or wildly optimistic), you'll need to adjust your plans accordingly.

And assumptions are rarely 100 per cent accurate – in either direction. That's why some companies end up with a huge glut of product they can't sell, while others end up having customers sign onto waiting lists because they can't keep up with demand.

Tuesday, April 5, 2016

Remembering Andy Grove - Third and Final Part of the Series

So my final thoughts on Andy Grove came from something he said at an Intel conference I attended at the height of the “dot com” boom. I believe it was in February or March of 2000.

To set the scene, Intel had invited a number of journalists (including myself) to its head offices in Santa Clara to hear the latest on the company’s upcoming product releases. I won’t bother going into the detail of that here, but suffice it to say that – in the course of this briefing – Andy Grove came into the room to talk with us.
In the course of that talk, someone asked about Intel’s approach to making investments in early stage companies. To provide context, Intel had – at that time – made announcements about how it was investing in promising start-ups that it said could enhance the use of Intel’s chip technologies (and the ecosystem in which they were used).
The discussion came around to Intel’s criteria for selecting companies in which they would invest – and the words “exit strategy” came into the conversation. At that point, Andy Grove became very animated and made it abundantly clear that he would not look kindly on any company that approached Intel with an “exit strategy” already planned – so that early stage investors (and founders) could cash out early with a tidy windfall for relatively little work.
Sadly, I was not able to lay my hands on my own notes of that event – but it was not the only time that he clearly articulated this view. In a really excellent 2012 interview on NPR, Andy Grove and Intel co-founder Gordon Moore (yes, the guy after whom “Moore’s Law” is named) looked back on their career together and reflected on exit strategies and many other things.

 
"I really don't have much respect for the people who live their lives motivated by an exit strategy existing, being performed," Grove says in the NPR interview. "There was no option that we were trained in that says, 'If it gets too hard, get up and leave.' "
I would strongly suggest that if you're curious about how Intel came about – or what made Andy Grove tick professionally – you not only read the interview, but also listen to the audio clips that accompany it. It's really good.
At any rate, what all this boils down to is Andy Grove’s advice: Build something that lasts and that matters. And that sounds like great advice for both life and business.

Monday, April 4, 2016

Remembering Andy Grove - Part 2

One of my most fun memories of talking to Intel pioneer Andy Grove took place in London - and I'm pretty sure it was at the Savoy Hotel - in 1991. He was fun, talkative and willing to go along with whatever creative ideas our photographer had for how he should pose.

I was interviewing him for a popular monthly UK computer magazine called Personal Computer World (the interview was later reprinted in a great book from Wendy M. Grossman called "Remembering the Future: Interviews from Personal Computer World").

Here are a few highlights:

Andrew Grove is not your typical computer industry mogul. Although he is a co-founder of the largest and most successful microprocessor company in the world - Intel - and has led the personal computer industry by the nose for most of the industry's 15-year life, you wouldn't know it to talk to him.


He greets you with a winning smile, a twinkle in his eye and a manner which suggests that being president of Intel is just what he does for a living - and that he really can't wait to get home, slip his shoes off and relax. Unlike most CEOs of large US computer companies, Grove does not appear a zealot for PC technology - or even for Intel itself.


During our interview, he was as comfortable getting into the fairly outlandish poses suggested by our entrepreneurial photographer as he was in discussing his then-upcoming keynote speech at Comdex '91 in Las Vegas.


But behind the easy manner and laid-back style is a self-made man in the best traditions of American business. Grove was born in Budapest, Hungary in 1936 and, after making his way to the United States, graduated in 1960 from City College of New York with a Bachelor of Science degree in chemical engineering. He took his Ph.D. at Berkeley (near San Francisco) and landed his first job at Fairchild Semiconductor.


And there Grove stayed until he was invited to help found Intel with Robert M. Noyce and Gordon E. Moore in 1968 - with the goal of developing a process to allow several thousand transistors to be integrated on a single chip of silicon with relatively high production yields. Shortly after, the first Intel microprocessor was born.


Since then, Intel has come to dominate the microprocessor industry, with every major manufacturer (except Apple) basing its systems around Intel-architecture chips. Over the past two years, however, the company has faced increasing challenges to its position.


After that intro, I talked quite a bit about the technology and competitive landscape issues of the day (which were very top-of-mind then - as much as the Xeon processor shown below is today).


Intel® Xeon® processor E5-2600 v4 CPU front view - photo courtesy of Intel
 
 
But the answers that really stand the test of time (and are actually fun to look back on) relate to his take on the future. Here's what he said about the way he thought networks would be used in the future (and remember, this conversation is taking place in late 1991).

The current ratio (of network staff to equipment) is one network administrator per network. Although this is 'do-able' today, what if you want to wire up 100 million computers? Current average network sizes are ten computers per network and at that rate it would mean 10 million network administrators would be needed. It's like when someone in the 1940s extrapolated that the size of the telephone network would eventually require everyone who used a telephone to become a long distance operator (and in the sense that most long distance calls are now done by direct dialing, they are right). So we will try to make network management functions so transparent that administration will just be one more function in any network application.
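As an aside, the arithmetic behind Grove's extrapolation is easy to check – here's a quick Python sketch using only the figures from the quote above:

    # Grove's extrapolation, using the figures from the quote:
    # one administrator per network, ten computers per network,
    # and 100 million computers to wire up.
    computers = 100_000_000
    computers_per_network = 10
    admins_per_network = 1

    networks = computers // computers_per_network
    admins_needed = networks * admins_per_network
    print(f"{admins_needed:,} network administrators needed")  # prints 10,000,000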


One other interesting insight from Grove came in a question about Microsoft and its role in the PC industry - and his answer reflected Microsoft's youth at the time.

It (Microsoft) is and has been the most productive developer of system software. For most of this decade it has been a small company trying to do a big job. It is not a criticism of Microsoft, but the amount of resource and development that went into processor development was not matched by that in systems software - and they are now catching up.




One observation about this interview - it was the only interview I ever did with a major technology company CEO that was followed up by a hand-signed note afterwards. When the interview appeared, Andy Grove wrote to me to tell me how much he had enjoyed it. He was a gentleman and a scholar.

More Andy Grove memories soon....

Sunday, April 3, 2016

Remembering Andy Grove - Part 1

Former Intel Corporation CEO Andrew S. Grove (known to many simply as Andy Grove) passed away last month at the age of 79. I had the good luck to interview him on a number of occasions - and always found him to be thoughtful, articulate and personable. I'm going to post a few pieces about the Andy Grove I remember - and will do it in a couple of parts.

Andrew Grove

This first one reprints a story I wrote for Canada's National Post newspaper back in 1999 when Intel had launched its new "Pentium III" chip - and he reflected on some of the company's earlier mistakes (including a calculation bug in the original Pentium processor that Intel initially brushed off as something which wouldn't impact most people - and then had to urgently fix after a consumer and press backlash).

Here's the 1999 story:

Andy Grove doesn't admit his mistakes easily. Intel Corp.'s chairman took the company from near extinction in the mid-1980s to semiconductor super-star status today by knowing a right decision from a wrong one.

But on the occasion of Intel's Pentium III launch, Grove is in a reflective mood. Perhaps it is because he has pulled back from day-to-day involvement in the company, ceding the CEO and president's job to Craig Barrett. Perhaps it is because he recently fought and won a very public battle with prostate cancer.

Mr. Grove now looks back on the famous "Pentium bug" fiasco in late 1994 and early 1995 with a self-critical eye and admits that he has learned from his mistakes.

Back then, the newly-introduced Intel Pentium processor was being heavily promoted to consumers as the chip that would really take their computers into the multimedia world - allowing them to use graphics, sound and video as never before. At the height of the 1994 Christmas sales season, a "bug" was discovered in the chip - a bug that would cause it to inaccurately complete a certain complex series of calculations.

For the majority of users, this bug would never intrude on the use or enjoyment of computers with Intel Pentium processors inside them. Intel said so - and offered only to help out those few consumers that it felt really needed help.

Consumer groups in the United States, however, took a different view and were shocked that Intel should presume to tell them which bugs were and were not a problem. "The customer is always right!" they told the company. Chastened, Intel eventually did move to widely distribute bug-fixes to the problem and Intel Pentium processors have enjoyed strong popularity ever since.

At the time, however, it was no sure bet that the Pentium would succeed. Intel competitors such as Advanced Micro Devices (AMD) had successfully produced "clones" of Intel's earlier generations of microprocessors including the 80386 and 80486 and stood to gain if consumers backed away from the Pentium (which, at that point, had not been copied by anyone). Any perception that Intel's future technology path was flawed in some way could have had a strong impact on the company's market share and sent consumers looking for PCs with other manufacturers' processors.

This was a very real fear – and one that Intel knew well. Back in the mid-1980s, after Intel enjoyed its first flush of success when its 8086 family of processors were used in the original IBM PC, the company came very close to going under.

Arch-competitor Motorola was enjoying huge popularity in Apple’s Macintosh, the Atari ST home computer, the Commodore Amiga and the Sinclair QL. And the notion of having desktop computers with the same company’s processor in them – generation after generation – had not yet taken hold.

Intel also had no guarantee that IBM would stick with its processor family and faced heavy competition from much larger Japanese manufacturers. The release of the 80386 in 1985 was really the turning point, after which Intel knew that it – not IBM – was in the driver's seat. This was because Compaq, not IBM, was the first manufacturer to ship systems based on the 80386 processor - and it enjoyed huge success as a result. IBM thereafter felt compelled to follow.

But we digress. The lesson from the Pentium bug experience came home to Mr. Grove in January when reports started to appear that Intel would be featuring a processor serial number (or PSN) on its new Pentium III computer processor. This electronic serial number would allow computer companies, corporate users of computers and operators of Web sites to use software to call up the serial number of the processor and keep track of which computers were doing what.

This time the outcry came from privacy and civil liberties groups concerned that processor serial numbers would make it easier for Web site operators to track the buying patterns and interests of consumers.

This time, Intel reacted differently - working with both interest groups and the rest of the computer industry to come up with a solution to the problem even before the Pentium III came to market. So when Pentium III systems were ready to ship at the end of February, Intel was able to announce that it had developed a utility that would allow users to switch the processor serial number feature "off" if consumers did not want to have their systems automatically identified.

 "We are a lot less righteous about things than we were four years ago," says Grove, admitting that Intel redoubled its quality control efforts after the original Pentium bug problem and has become much more responsive to consumer-level concerns. "Having said that, no-one will ever built a perfect microprocessor," he adds. "In retrospect, the problem in 1994 was not the problem (with the chip) but the way we handled it and reacted to it."

He is even philosophical about the company's current legal battle with the United States Federal Trade Commission (FTC), although he strongly denies that there is any merit to the notion that Intel has engaged in any unfair trade practices. The FTC has charged that by withholding products and product plans from Intergraph, Digital Equipment Corporation and Compaq Computer, Intel illegally used its "monopoly power" to force those companies to cease pursuing intellectual property claims they were then making against Intel. Intel denies any wrongdoing and says it was only taking appropriate steps to protect its intellectual property.

Saturday, April 2, 2016

Chasing the future


Everyone has their own vision of the future – and yet few of those visions actually align very closely with what reality eventually looks like.
Take the flying car as an example. It’s been a popular idea for most of the past 100 years – making appearances in every era. Not only did comic books and science fiction stories of the 1930s and 1940s often feature a flying car, but they were a staple mode of fictional transportation thereafter in everything from The Jetsons to Star Wars to Harry Potter.
And yet, it’s not something that we as a species have seen fit to put a lot of energy into. While there are lots of examples of real-life flying cars (see this YouTube video and all the associated ones on this page), none has ever given consumers a compelling reason to buy one in massive numbers.
There are lots of other reasons for this – including the additional skills required to “drive” in three dimensions and the often questionable safety and durability of the many prototype flying cars that have been produced throughout the last 100 years. And, of course, none of them have been particularly cheap or manufactured by mainstream car makers.
But you get the idea. The same could be said of the “videophone” – once a ubiquitous part of any writer’s vision of the future, but now merely an add-on to what the average consumer can do with their smartphone, laptop, desktop computer or even videogame console. But it’s not a replacement for the ‘phone in the way that futurologists of 30 or 40 years ago had once envisioned.
Thirty years ago, it was – and certainly formed a major part of how telecommunications companies saw their future. I remember attending a major industry conference in Geneva in 1988 where there was much excitement about the idea that the videophone was just around the corner.
Only the corner was a lot further down the track than those companies expected (many of them were full or quasi-monopolies that have long since been broken up by legislation or as the result of poor management). And the eventual form in which the functions of a videophone became popular was free applications such as Skype and FaceTime that merely enhanced the attractiveness of the platforms on which they were offered.
So while video calling is now a popular part of our culture, it didn’t become so out of a massive demand by customers for a dedicated video calling device – but rather as a result of cheap, easy-to-use software that provided a “nice to have” option for consumers.
None of that is to discount the huge importance of advances such as videoconferencing for learning, medical and business applications, but rather to just underscore that predicting the form and application within which an “obvious” future technology might become successful is a hard thing.
To my mind, that’s one of the struggles facing the 3-D printing business right now. While 3-D printing is doubtless a ground-breaking and rule-changing technology that can fill in a vast number of niches, it is not yet clear what its dominant application will eventually be.
 
So I was intrigued to see the recent release of a new car for which many of the parts have been 3-D printed - this one from the “microfactory” of Phoenix-based Local Motors. Local Motors is clearly trying something pretty bold and ambitious with its car (including a price tag of $53,000), but as a project to dip an inventive toe into the churning waters of the future, it's hard not to be impressed.
It seems unlikely that this company will give Tesla anything to worry about, but it does provide hope that there are still keen entrepreneurs and inventors out there willing to take a chance on their vision of the future in the hope that others will share it.