January 23, 2012
Stopping the Pirates Without Squashing Everyone Else
Regarding your editorial "Brake the Internet Pirates" (Jan. 18) on the Stop Online Piracy Act and its Senate companion, the Protect Intellectual Property Act: The GOP presidential hopefuls and Rep. Paul Ryan oppose these bills, as does virtually every prominent Internet CEO, investor and entrepreneur. I speak not for Google--it has the resources to defend lawsuits and manage regulatory compliance--but for Internet entrepreneurs like myself who are concerned about our ability to start, operate and innovate growth companies.
We abhor online piracy, but these bills won't stop it; instead they would impose a censorship regime, regulatory burdens and legal exposure on Internet companies and their users here and around the world. This is ObamaCare or Dodd-Frank for the Internet--perhaps well-meaning but introducing disruptive regulatory uncertainty that will throw a monkey wrench into one of the best engines of job creation this country has. Suing Internet companies under SOPA may become the occupation of choice for trial lawyers who cut their teeth on shareholder class-action lawsuits.
While the more draconian components may be removed by recently proposed amendments, as they stand the bills still create censorship in the U.S. while drafting Internet companies to be the enforcers. Once that system is in place, how will it be expanded?
The Constitution grants copyright authority "To promote the Progress of Science and useful Arts." The Internet has promoted such progress by democratizing the creation and distribution of innovations and the arts, enabling individuals to publish their works without having to go through major studios or publishers. This may explain why those industries view some Internet innovation as an existential threat, but it doesn't justify laws that do little more than open legal floodgates for one industry while imposing significant burdens on another, and violating the core idea of freedom of expression in the process.
Christopher J. Alden
Mr. Alden was a founder of Red Herring magazine and is an entrepreneur.
October 6, 2011
I am stunned by the news. I tend to be an optimist about things and I assumed Steve would recover.
It's only hours since the news broke and I'd guess more may have already been written, blogged, tweeted, and updated in tribute to Steve Jobs than to perhaps anyone else in modern times. But writing about Steve is more for the writer than the reader, so I feel that I have to write. I'll make this inhumanly long so no one will ever read it.
We take for granted the geniuses that invented the world we live in -- a world of light and steel, energy and transportation, food and medicine, entertainment and communication, and of physics, biology, and chemistry -- as they have been outlived by their creations. My kids and grandkids won't remember Steve, but they will live in a world that he helped invent.
It was truly remarkable to grow up as Apple was growing up and it felt like I developed as it developed. It's hard to overstate how much of an impact Apple has had on my life. I'd like to remember Steve first by remembering my life with Apple.
First, I remember Space Invaders on the Apple II. Lemonade Stand, Oregon Trail, Pegasus. Then, programming in Basic. I remember those Apple IIs and their cool, plastic covers. Opening them up. Adding in memory chips. Plugging in colorful wires to floppy disk drives.
Our high school had us all read Orwell's 1984 and I remember the Super Bowl commercial. It resonated on many levels.
I remember John Sculley, who had stopped selling sugar water to be CEO of Apple, coming to my high school (his daughter was a student there) and showing us a slide show of the Mac and how it was made. Cool. I got one. Then the 512. And up and up. As soon as I had a Mac, I wrote all of my high school papers on it and slaved over my college applications.
I was fortunate to get into Dartmouth -- a school of Macs -- where I spent far too much time playing Risk and Shufflepuck Cafe. I remember getting a HUGE 20MB hard drive. I remember seeing the first color Mac. I remember explaining what "BlitzMail" was to Robert Reich (cool to tell a future Secretary of Labor what email was -- I hope I was the first one). All my work, and a lot of my fun and my social life, went through that computer.
During a summer break I had a job in a real estate development firm and to track leads I built a FileMaker Pro database running on a Mac Portable. Apple won't be putting the Mac Portable on the top of their all time best product lists, but I loved it. It was a Mac.
The next fall, off for the term, I decided to start a company with a friend, Zack, as we had promised each other back in 7th grade that we'd do so. The logical thing to do: teach people how to use the Mac. Thus was born Computer Guides -- consulting, teaching, and classes in how to use the Mac, Excel, PowerPoint, PageMaker, and, of course, FileMaker Pro. We went back to our high school, where Sculley had first shown us the Mac, and taught classes for parents who wanted to understand how to use these contraptions that their kids were so proficient with.
Back at college, more Blitz, more games, more papers done in Write and Paint, and then graduation. My graduation gift? A Mac. Quadra 950. A huge beast of a Mac that drove a 21-inch color monitor.
My first month back home, in a house just off Sand Hill Road, I met Tony. He had just left Upside and wanted to start a newsletter about technology finance. He needed cheap labor. And a Mac. I could supply both -- and the Quadra 950 was the top of the line.
Zack joined in and we spent the better part of a year creating a magazine (we decided against the newsletter approach) with a silly name -- all on a Mac. There is not even a chance that we would have started a magazine like this, without any money, if it hadn't been for the Mac and the desktop publishing era it ushered in.
We didn't take it too seriously. Red Herring was going to be a short stint, we were sure, and something that would look good on a business school application. I never made it to business school.
To make money while we worked on the magazine, we went back again to our high school and became teachers. Computer teachers. Teaching the Mac, and all the wonderful things it could do. Steve Jobs came by one day, checking out the school for his daughter Lisa. Zack and I met him and tried to tell him how important Macs were to the school. But our computer "lab" was, both literally and figuratively, in the basement. Macs should be everywhere, Steve said. Lisa decided to go elsewhere. That year we worked with the school to design a new computer curriculum and a new computer lab, moving it from the basement to the front of the school.
Red Herring launched while we were still teaching, in 1993, and my days and nights were thus spent in front of a Mac -- using FileMaker Pro for the subscription base, editing in Word, publishing and designing with QuarkXPress, Photoshop, and Illustrator.
We covered Apple and NeXT and Pixar. We interviewed Steve Jobs, the second chance I had to meet him, and declared "He's Back" prematurely. This was before he came back to Apple. I remember we asked him what was wrong with Microsoft. He said, "They have no taste." What an odd thing to say. Tech people didn't care about taste, design, or style. Style mattered for life; this was tech. Those things were separate. Tech cared about functionality and features, not design.
Meanwhile, Apple was struggling. I have to confess that frugality got the better of me and as our company grew into the hundreds, I moved away from Mac. Steve wasn't there and Apple wasn't as magical then. PCs did the same thing and were cheap.
Apple needed fixing and its leaders were getting it wrong. We had to speak up and in 1997 we wrote, in one of the pieces I'm proudest of, an open letter titled "Gil Amelio, Please Resign." I actually liked Amelio as a person -- he was very brave and gracious to meet with us after we published that letter -- but he was no Steve Jobs. A few months later he resigned (reportedly forced out by the board) and I seem to remember reports that pressure from the media was cited as a factor. I don't know how big of a straw that open letter was on that camel's back, but it was a straw nonetheless and perhaps -- just perhaps -- it really helped galvanize a growing sentiment that Amelio was just wrong for the job. He was replaced by Steve Jobs.
Steve told Apple, and then he told the world, to Think Different and thus emerged the "i" decade. iPod. iPhone. iTunes. iPad. My story with Apple this last decade is much like many others: the music, the fun, the games, the apps. The magic.
I got to see, but never speak to, Steve a couple of times -- at announcements and conferences -- and every time he seemed even wiser and more insightful. Or had I simply fallen more under his spell?
Last year I went back to my high school and talked about technology -- and Apple. I'm convinced that iPads will transform education the way Macs, consigned to the basement, never could. iPads can be in your hands -- be in everyone's hands -- and they are much cheaper, much easier to use, and, most importantly, can call up text, video, and apps from around the world. It won't be long before every kid, by a certain age, has an iPad. Every teacher (well, the good ones) will use it in some way to transform how they educate, and every parent (well, the good ones) will use it to ensure their children, especially those whose schools are failing them, can access a world of brilliant teaching, lesson plans, reference materials, Khan Academy videos, and whatever great innovations lie ahead from the millions of creative individuals who want to improve how we educate our children. In fact, the iPad may become the primary window into the world's best educators, courses, and apps, while the roles of parents, teachers, and schools change more in the next 10 years than they have in the last 100.
I feel so strongly about this that I bought our high school its first batch of iPads and am already thrilled by how they are being used. It was fitting to help bring a little Steve Jobs magic back to a school that helped bring that magic into my life. A school that first showed me the Mac and then brought me back to teach the Mac to expand the role it played on campus. A school that missed out on having Steve's oldest daughter attend but now has a student, using an iPad in an innovative pilot program, whose father changed the world forever and has now departed, leaving all of us to grow, create, and thrive in the world he invented: Steve Jobs.
I should end it there, but I won't. Because I feel that this creative genius has left a void. And I have a call to action: to myself and to everyone. We need to fill this void. Few, if any, will be able to match or exceed what Steve has done, but collectively, using much of the infrastructure Steve helped inspire and create, we can do it.
But will we? It's worth listening to, or reading, Steve's 2005 Stanford commencement speech. He talks about life, about adversity, and about death. We see the brilliance of a man who didn't just overcome adversity but used it. He didn't wallow in self-pity. Adversity motivated him. Had he never dropped out of college because his parents couldn't afford the tuition, would he have started Apple in a garage? Had he not been fired from Apple and experienced "the lightness of being a beginner again," would the "i" decade have ever happened? Had he never been diagnosed with pancreatic cancer, would we have the iPad?
Perhaps. But when adversity and character blend in extraordinary people, the world changes. We face adversity now -- Steve has left us. But do we have the character? Will this demoralize or motivate us? Is this the end of an era or the beginning of one?
I should also fight the urge to make broad social commentary, of the sort I usually recoil from when offered by others, but for some reason I just can't resist some sanctimony. To wit: have we lost our way? Why have we allowed Steve Jobs to out-innovate all of us for three decades? Do we, as a society, have the character to confront adversity and use it as motivation, rather than as an excuse for our failures and justification for self-pity?
Something alarming has happened to humans lately -- we haven't been innovating the way we used to. As dramatically as life has changed in the last 10, 20, even 30 years, it changed more quickly and more profoundly in the decades and centuries before Apple was founded -- when measured per capita. The Apple years, that is, have seen less innovation per person than any other 30-year period that came before them, going back centuries. Steve did his part. But the rest of us, as a whole, have not.
Why is this? I don't know. Some believe that the "low hanging fruit" has all been eaten -- land has been settled, society was industrialized, cheap energy was tapped, quick wins emerged as we figured out the fundamentals of physics, chemistry, biology, medicine, etc. The big problems of energy, health, transportation, communication, et al. have all been cracked, albeit not completely nor spread universally, and the big steps have been taken. We have nothing left to tap into. No more great leaps to take.
Maybe. But I'm not convinced we can see low hanging fruit when it is in front of us, so more may still lie ahead.
But are we looking for it? Society was obsessed through the ages with growth, building, prosperity, greatness, and conquest. Yet as we started eating, not just picking, the low hanging fruit, we began to spend more time contemplating the higher tiers of Maslow's pyramid. Societal progress isn't smooth and growth often brings periods of dislocation. Revolutions happen. Change is jarring. The 1960s galvanized an era in which we focused less on greatness and conquest and more on self-reflection, judgment, and improvement. Despite the fantastic improvements in the human condition that innovation ushered in over centuries, there have always been "stasists," as Virginia Postrel calls them, who have fretted more about the downsides of growth and innovation than the advantages. Jeremy Rifkin was at the same event we organized at Dartmouth when I met Reich. I remember him pointing out that while the arms of a classic wristwatch moved, digital timepieces did not have moving parts, depriving us of a visceral sense of time -- of a connection to reality. A powerful metaphor. And drivel.
The 1960s were vital to our development and transformed our world for the better in many profound ways. But it also ushered in a new age of doubt and guilt: what had our new physics, new chemistry, new biology, and new medicines wrought? It took us millennia as a species, and countless geniuses, to achieve a relatively bountiful world of light and steel, energy and transportation, food and medicine, entertainment and communication, and of physics, biology, and chemistry. It has been our tireless quest, ever since we evolved a will and an intellect, to develop cheap energy, plentiful food, rapid transportation, low infant mortality rates, and long life expectancies. And just as we got them, we started worrying about them and what they were doing to us.
Over-population, GMOs, nuclear energy, pesticides, deforestation, ozone depletion, extinctions, acid rain, and global warming have (in the developed world) replaced hunger, want, deprivation, disease, and all sorts of corporeal misery and physical trauma as the beasts that society must slay. To fight these maladies, which some blame in part on growth and innovation, we developed antibodies that curbed growth and innovation and brought us a new mindset. We are now "destroying the planet," while most of our history as a species has been spent trying to stop the planet -- its animals, its bacteria, its storms, its ice ages and warming periods -- from destroying us. We haven't attacked the planet; we counter-attacked it. We survived! Then we surveyed. And now we sulk. We lament our "addiction" to oil. Suddenly, the boon of cheap energy, the ether of our prosperity, the manna that every generation before ours would have, literally, killed for, has become a villain, not a hero. Suddenly, instead of thinking about how big of an impact we can have on the planet, we focus on how small of an imprint we should make. Kids are taught in school to be "green" and leave small footprints, not big ones. We don't want them to build big, bold things -- things that shape the planet and adapt it to our needs -- we want them to adapt to what the planet needs. We recycle. We reuse. We reduce. We repent.
Humanism is out. Naturalism is in. The planet doesn't work for us anymore, we work for it. Freeman Dyson, a genius, has been ridiculed and ostracized for suggesting that it should be the other way around.
We aren't focusing outward, upward, or even forward. We are focusing inward. Many of our tech inventors just spent a decade on "social" innovation and the new tech billionaires are people who help us connect, remember, get in touch, share, and, lord help us, tweet. This isn't about shaping the world, it's about hanging out in it. We innovate in virtual space and create new worlds, because real space is off limits. Real space people, who mine and drill the earth, who develop big buildings and use lots of energy, who burn things and build things and chop things down, are dirty at best and villains at worst. If they make a profit, it's a windfall. If they make too much money, it's unfair. They are to be resisted, regulated, protested, and taxed.
Meanwhile, how much time has been spent, or squandered, by talented young people who've decided that the best use of their energy, imagination, and intellect is to try, through facile pattern matching rather than true innovation, for a one-in-a-million shot at becoming the next Mark Zuckerberg?
I think it's time for synthesis. I remember studying Hegel's dialectic at Dartmouth: thesis, antithesis, and synthesis. Let's accept the conservative thesis that prosperity through industry and innovation, not the leviathan, has delivered us from lives that were solitary, poor, nasty, brutish, and short. Let's accept the progressive antithesis that this prosperity has had costs as well as benefits and has not been equally distributed. But then let's move beyond conservative and progressive to synthesis: unleash the power of innovation and the forces of prosperity and point them at real problems. Prosperity hasn't created our problems; it just hasn't fixed them all -- yet.
Which brings me back to Steve Jobs. He focused on big problems and developed bold and brilliant solutions. He figured out what we needed before we knew we needed it. He bent the world to his will and he left a big footprint. Confronted with adversity, he was the Atlas that didn't shrug. He told those graduating Stanford students: "Your time is limited, so don't waste it living someone else's life. Don't be trapped by dogma -- which is living with the results of other people's thinking. Don't let the noise of others' opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary."
The first wave of the Internet was built on the personal computing infrastructure he created. That dot com boom built the infrastructure -- the access, the browsers, the search, the stores, the software, the servers, the networking equipment -- upon which the second wave, Web 2.0, was built. This social wave, combined with the new accessibility of Steve's "i" decade -- devices that are easy to use, touchable, mobile, and full of music, movies, and apps -- has created a whole new foundation for us to build on and new materials to build things with. We shouldn't ridicule a world in which nearly 8 million people follow @aplusk and millions more play Angry Birds; that world helped build these platforms.
But for the third wave, let's be bolder. Let's be more audacious. Let's build on this foundation and try to change the world. For real. Let's confront the adversity of a world without Steve and let it motivate us, not demoralize us. Let's rally to the cause and, following his example, let's fill the void he's left -- and then some. Let's not wallow in guilt about the state of the planet nor deny the damage that is being done -- let's set ourselves to making things better through our creativity and industry. Let's not waste time with small permutations on other people's big ideas. Instead, let's resist dogma and have the courage to follow our heart and intuition. Let's be hungry. And foolish.
December 20, 2010
I was just reading through an old Red Herring magazine. Red Herring Issue 55 was our 5th Anniversary Issue, published in June 1998, and it featured Jim Barksdale on the cover in front of a red curtain, exiting stage left - a metaphor for Netscape's surrender in its battle against Microsoft. It's remarkable that even in the summer of 1998, four years after Netscape was founded, we were still talking mostly about the basic building blocks of the Internet (remember the browser wars?) rather than what we were going to build with those blocks. That issue's major focus: Java. Amazon, eBay, and Google weren't even mentioned in that issue - the last omission probably because Google hadn't been started yet :).
Issue 55 was a major redesign, crafted by the famous magazine designer Roger Black, and was our "pivot" (though we didn't use the word back then) from an insider Silicon Valley finance magazine to a "business of technology" magazine - a business magazine, first and foremost, focusing on what we thought was the most interesting part of business: technology, innovation, and entrepreneurialism. Technology, we felt, had moved from a vertical industry to something that touched all of business, and this issue was our adjustment to that reality.
Encapsulated in that issue, which was our optimistic appraisal of the future of technology, was a contrast of two very different economists: Paul Krugman and Julian Simon. I suspect that of the two, many more people know Krugman (who is now a columnist for the NYT and a regular on ABC's This Week) than Simon. And that's a real shame.
The two economists appeared in different ways. Krugman wrote a deeply, though unintentionally, ironic article titled "Why most economists' predictions are wrong," filled with predictions that were... mostly wrong. Simon, sadly, had just passed away, so his appearance in the issue was by way of David Henderson's remembrance of him.
Krugman is probably one of the most influential economists in the lay world today, if not the most influential, so it's interesting to see how well his predictions have stood the test of time.
Krugman's article, written in 1998, is a missive against "overly optimistic economic forecasting," and it makes remarkable claims like: "when all is said and done, the technological progress we keep hearing about is occurring in only a small part of the economy," and "The truth is that we live in an age not of extraordinary progress but of technological disappointment."
He went on to make some specific predictions, all of which were either mostly or completely wrong:
"Productivity will drop sharply this year."
Nope - didn't happen. In fact productivity continued to improve, as this chart shows:
"Inflation will be back. ...In 1999 inflation will probably be more than 3 percent; with only moderate bad luck--say, a drop in the dollar--it could easily top 4 percent."
Nope - that didn't happen either. Inflation in 1999 was 2.19% and hasn't gone above 4% since Krugman wrote this piece.
"Within two or three years, the current mood of American triumphalism--our belief that we have pulled economically and technologically ahead of the rest of the world--will evaporate."
Nope -- that didn't happen, either. Though September 11th, which happened more than three years after this article, and the Lehman Brothers' collapse, which happened more than 10 years after this article was written, have certainly reduced American triumphalism. Here is where I think Krugman may have been the most right, albeit way too early.
"The growth of the Internet will slow drastically, as the flaw in 'Metcalfe's law'--which states that the number of potential connections in a network is proportional to the square of the number of participants--becomes apparent: most people have nothing to say to each other! By 2005 or so, it will become clear that the Internet's impact on the economy has been no greater than the fax machine's."
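(As an aside of my own, not Krugman's: the arithmetic behind Metcalfe's law is easy to sketch. The quadratic growth he was dismissing looks like this in a few lines of Python.)

```python
# Metcalfe's law: n participants can form n*(n-1)/2 distinct pairwise
# connections, so potential links grow with the square of n.

def potential_connections(n: int) -> int:
    """Number of distinct pairs among n network participants."""
    return n * (n - 1) // 2

# Each tenfold growth in participants yields roughly a hundredfold
# growth in potential connections:
for n in (10, 100, 1000):
    print(f"{n:>5} participants -> {potential_connections(n):>8,} potential connections")
```

Whether most of those connections are worth anything is, of course, exactly the point Krugman and Metcalfe were arguing over.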
"As the rate of technological change in computing slows, the number of jobs for IT specialists will decelerate, then actually turn down; ten years from now, the phrase information economy will sound silly."
"Sometime in the next 20 years, maybe sooner, there will be another '70s-style raw-material crunch: a disruption of oil supplies, a sharp run-up in agricultural prices, or both."
Meh. While we have seen oil prices spike (although they have yet to reach the annual peak we saw in 1980), this was not due to a crunch or disruption or running out of oil (we have more known oil reserves now than when Krugman wrote this article) but rather to growth in demand.
This whole article was supposed to be proof that some economists are overly optimistic, but in retrospect all this did was prove that Krugman is overly negative. Not only were all of his predictions wrong, but they were wrong in the same direction: they were all too negative. (And in an article in which he was criticizing another economist for being too optimistic!)
Contrast this with Julian Simon. Where Krugman underestimated humanity, Julian Simon looked at the evidence and realized that humans, and our ability to innovate, are a huge resource. Here's a passage from Henderson's article, and the whole thing is worth reading.
Simon seemed to deny the obvious. He said that resources were not finite. He hadn't always believed this: in the late '60s, he wrote papers advocating the use of economic incentives for women to have fewer children. In the preface to his 1996 book, The Ultimate Resource 2, he explained that in the '60s he had "aimed to help the world contain its 'exploding' population, which I believed to be one of the two main threats to humankind (war being the other)." But Simon found that the evidence didn't support that gloomy view--if governments left people free to innovate, invest, and create, he determined, population growth did not, in fact, hinder economic development, reduce the standard of living, or dry up natural resources. So Simon actually changed his mind, something that is much rarer among academics than you might think.
How could population growth not reduce resources? It is true that in the short run, population increases drive up demand for natural resources and thus their prices. But then the high prices prompt entrepreneurs and innovators to find new resources, or new ways of getting existing resources more cheaply. The net result: resources are more plentiful and cheaper than they were before the population grew. In The Population Bomb, Mr. Ehrlich generalized from animal behavior--he had studied butterflies--to human behavior. But Simon saw humans as fundamentally different from animals. He liked to quote the 19th-century American economist Henry George: "Both the jayhawk and the man eat chickens, but the more jayhawks, the fewer chickens, while the more men, the more chickens."
This holiday season, I will be remembering Julian Simon.
From FCC Commissioner Robert McDowell in the Wall Street Journal:
For years, proponents of so-called "net neutrality" have been calling for strong regulation of broadband "on-ramps" to the Internet, like those provided by your local cable or phone companies. Rules are needed, the argument goes, to ensure that the Internet remains open and free, and to discourage broadband providers from thwarting consumer demand. That sounds good if you say it fast.
Nothing is broken that needs fixing, however. The Internet has been open and freedom-enhancing since it was spun off from a government research project in the early 1990s. Its nature as a diffuse and dynamic global network of networks defies top-down authority. Ample laws to protect consumers already exist. Furthermore, the Obama Justice Department and the European Commission both decided this year that net-neutrality regulation was unnecessary and might deter investment in next-generation Internet technology and infrastructure.
On the FCC blog, Julius Genachowski, Chairman of the Federal Communications Commission, wrote: "We must take action to protect consumers against price hikes and closed access to the Internet--and our proposed framework is designed to do just that: to guard against these risks while recognizing the legitimate needs and interests of broadband providers." What price hikes? What closed access? We've had a consumer Internet for over 16 years and yet we haven't seen these problems in any significant way -- only imagined "risks," conjured up in tech conferences and overhyped by large corporations trying to bugger other large corporations. Protecting against "risks" that haven't materialized is a very low bar for this kind of massive government intervention into the Internet industry.
April 23, 2010
Did you know this?
Mr. Dodd's bill would change all this for the worse. Most preposterously, it would require that start-ups seeking angel investments file with the Securities and Exchange Commission and endure a 120-day review. Rare is the new company that doesn't need immediate access to the capital it raises, and a four-month delay is the kind of rule popular in banana republics that create few new businesses.
The legislation also removes a federal pre-emption that prevents start-ups and investors from being subject to 50 different state regulators. The North American Securities Administrators Association, which represents state regulators, argues that federal pre-emption contributes to fraud. But angel investors don't use broker-dealers and other middlemen linked to recent investment scandals. Nascent companies often seek financing from multiple investors in different states, and a state-by-state regulatory regime would mean higher compliance costs and more legal risks.
If this provision passes, it would be devastating. And even if it doesn't, it's more proof that the folks in Washington, D.C. just don't get Silicon Valley or entrepreneurialism. Where is the compelling interest for the SEC to impose this kind of hurdle in front of start-ups?
August 9, 2009
I would rather see the White House and Congress work first and foremost on a pro-growth and pro-jobs agenda that would include lowering, not raising, taxes and less, not more, intervention in private industry. While I feel better about the short term than I did six months ago, since we were then facing the possibility of Armageddon, I am now more pessimistic about the long-term outlook.
Here's the unabridged version:
I think many in Silicon Valley would like to see the administration pursue more pro-growth policies. The start-up, angel investor and venture capital industry has helped build companies that have created a huge number of jobs, and with unemployment moving toward 10% it's an industry that should be encouraged and supported. While the Obama campaign said it would eliminate capital gains taxes for start-ups, instead the industry is looking at substantial tax increases on business, income, capital gains, and carried interest -- not to mention the energy and healthcare taxes now being debated in Congress -- and the administration has suggested it might force VC funds to register with the SEC. This is not what the Valley needs in order to resume being an engine of job creation.
There is real concern here that spending has been excessive, has not been used wisely, and may in fact be crowding out private investment. I've heard anecdotes of companies in the telecom, energy, and healthcare industries holding off on investments because they are waiting to see if they can get bailout money.
On the issue of free trade, the "buy American" provision of the stimulus bill was probably unhelpful to the cause, and I'm concerned that agreements with the likes of Colombia and South Korea may be stalled. When it comes to the issue of H-1B visas, this is still important to the Valley, but with such high unemployment I doubt there will be any political will to raise caps.
My concern is that while we may have avoided the worst of it, unemployment keeps rising and the danger of a double-dip still looms. It seems as though the Obama Administration has moved on from the economy and is focusing more on its healthcare and energy agendas. While those are important issues, I would rather see the White House and Congress work on a pro-growth and pro-jobs agenda first and foremost that would include lowering, not raising, taxes and less, not more, intervention in private industry. I am very concerned that the tax and spend policies of this Administration will result in very slow growth for the foreseeable future. While I feel better about the short term than I did six months ago, since we were then facing the possibility of Armageddon, I am now more pessimistic about the long-term outlook if the heavy taxing, spending, and intervention into private industry doesn't abate.
July 24, 2009
The car industry today is as vertical as the computer industry was before the PC. However, the simplicity of the electric car combined with the standardization of certain components may cause the automobile industry to shift to a horizontal structure. The Internet is already emerging as a key marketing medium for automobiles and is easily adaptable to a horizontal structure.
July 21, 2009
Ex nihilo nihil fit: out of nothing, nothing comes. It is as though corporations could work their way to greater wealth by issuing stock dividends to their shareholders. One might as well increase the size of a cherry pie by increasing the number of slices.
June 8, 2009
April 15, 2009
SGI's high-performance, highly proprietary computing systems fell victim to the spread of cheap Linux boxes hooked up together with massive redundancies.

This from Red Herring, September 1995, in an Open Letter to Ed McCracken, CEO of Silicon Graphics:
In 1992 you bought MIPS, the microprocessor designer and manufacturer, because you were its last major customer and needed to guarantee that SGI could continue to use MIPS chips. Now you rely completely on that technology. In the meantime, Intel and Motorola/IBM/Apple have each spent a billion dollars on their Pentiums and PowerPCs--Intel will spend $3.5 billion in capital investments and $1.3 billion on research and development for 1995 alone! And this investment will only grow. In 1994, Intel-based systems doubled in price/performance. With volumes in the tens of thousands, not in the tens of millions, how can SGI compete? And stiff competition is just around the corner: a Windows NT system with multiple P6s on the motherboard will compete with your high-end machines at a fraction of the cost.

This seemed obvious in retrospect, but it wasn't obvious at the time, and I think we were the first magazine to really call attention to this trend. I'm actually quite proud of this article and remember being quite nervous about it. After all, who were we to openly challenge the brilliant minds at one of the Valley's hottest companies? I was 25 at the time and had zero direct experience in the industry that would qualify me to make a sound judgment on SGI's strategy.
But we were confident in one thing: we actually talked, and listened, to some of SGI's best customers (even after almost 15 years I won't reveal the sources!), and what they were saying made sense. Those customers didn't think SGI was listening, so the main goal of our letter was to urge them to do just that:
Don't Alienate Your Customers

SGI's response to our article was telling. Rather than following our advice, they invited us to their offices, loaded the room with some of the smartest people I'd ever met, and tried to convince us that we were wrong. They had convinced themselves that they COULD in fact compete with the Wintel price/performance trend, and I guess they thought that if they could convince us that their strategy was sound, we would in turn convince their customers through our writing. But that fundamentally misunderstood our role at Red Herring. SGI's customers influenced US more than we influenced them -- so SGI didn't get how the patterns of influence worked. You don't change customers' minds through PR; you change their minds by listening, engaging, responding, and adjusting.
Regardless of the strategy you choose (or may have already chosen), we urge you to work closely with your customer base and announce your long-term plans--or the grumbling masses of graphics professionals who depend on SGI will mutiny, and turn to Intel and Microsoft for leadership. We realize that abandoning your MIPS-based systems will threaten short-term sales, but, right now, the greatest thing SGI has going for it is its momentum. Don't spoil that by alienating your loyal customers.