Wednesday, April 27, 2011

Thoughts on the Irish Banking Crisis and the Nyberg Report

I just finished reading “Misjudging Risk: Causes of the Systemic Banking Crisis in Ireland,” a recently published report of the Commission of Investigation into the Banking Sector in Ireland, written by Peter Nyberg.

Mr. Nyberg is a Finnish economist with a background in financial regulation.

The Nyberg report paints a sobering picture of the Irish banking industry in the period under investigation from 2003 to 2009.

The crisis was caused by the unhindered expansion of a property bubble financed by the banks using wholesale market funding.

Investors considering investing in banks might want to take note of some salient conclusions:

“The Commission has…explained the crisis essentially as a consequence of applying a naïve version of the efficient market paradigm, supported by groupthink and herding. This helped create and strengthen a mania in the Irish property market. Professionals and non-professionals alike became convinced, and convinced each other, that financial markets were stable by themselves, despite historical evidence to the contrary.”

“The Commission has been widely assured by bank management, non-executive board members and others that the problems in banks' loan books came as a complete surprise. There is regret, incredulity and guilt among them at the lending and funding policies pursued and the lack, at the time, of any recognition of what was happening. The credibility of their assertions is increased by the fact that a number of them personally suffered substantial losses in the crisis, easily avoidable if advance warnings had been available and recognised.”

“Because the real reason for the crisis is the spread of an ultimately irrational point of view, regulations and watchdog institutions cannot be counted on to be efficient preventers of a systemic crisis. As has been seen in Ireland and other countries, central bankers and regulators embraced much the same paradigm as the market participants and adapted their policies to their convictions. The result, as shown by the crisis itself, was that no effective brake on risk-taking existed for years. It does not appear wholly unfair to propose that this is what may happen also in the future if and when another new financial or banking paradigm appears. Many of the very reforms that recently have been undertaken, at short notice, to shore up the functioning of the present financial system could turn out, once again, to be ineffective.”

To summarize: groupthink and herding behavior contributed to the crisis; bank management—the individuals best positioned to understand what was going on—were clueless; and regulations and watchdog institutions cannot be counted on to prevent future crises.

All in all the report serves as a cautionary tale for those considering taking debt or equity positions in the banking sector.

Any person considering an investment should seek independent advice from an investment advisor on the suitability of the particular investment.  Blog postings are for educational purposes only and do not constitute a recommendation for the purchase or sale of any security. The information is not intended to provide investment or financial advice to any individual or organization and should not be relied upon for that purpose.

Sunday, March 13, 2011

Investment Strategy in the Information Age:
A Discriminating Approach to Technology Investing

Note: This article was originally written in late 2002 and early 2003.

“The Information Age is an infinite loop of inspiration, opportunity and hope.
It is the geometry of dreams: what we imagine, we can build.
What we build inspires us again.
And the cycle of invention continues”.

 - Applied Materials 1996 Annual Report



By the middle 1940s, the American telephone industry was faced with a problem.  As the post World War II economic expansion took hold, the overall volume of telephone calls was beginning to rise dramatically.  The phone system routed a call by passing it along a network of switches.  Each switch made an individual routing decision and then passed the call on to the next switch in the network.  However, rising call volumes were creating a growing number of possible connections that, in turn, demanded increasingly faster switches and tougher routing decisions.  At the time, telephone operators still connected phone calls manually.  As a result, growing call volumes and increasing routing complexity were leading to exponential growth in demand for telephone operators.  Without new techniques to route calls, eventually the phone system would require all the women in the United States to be operators.

In response to this dilemma, a small team of engineers at Bell Laboratories began researching a new switching mechanism.  The engineers sought to make meaningful advances over conventional technologies such as mechanical switches and vacuum tubes, which carried serious limitations—vacuum tubes were bulky and consumed a lot of power.  After a great deal of trial and error, the crucial breakthrough came in 1947 when the team of John Bardeen, Walter Brattain and William Shockley invented the transistor at Bell Labs in Murray Hill, New Jersey.  For the invention, the three shared the 1956 Nobel Prize in Physics.

A transistor can be thought of as a small switch.  It is a tiny device that channels electrical signals through a gate.  And while its original purpose was to serve the telephone industry, the transistor has proven to have far-ranging applications beyond the scope of telephone call routing.  Transistors eventually led to the microchip (also known as the microprocessor or semiconductor), an integrated circuit containing millions of transistors on a tiny slice of silicon.  It has served as the core building block for information technology and the digital revolution.  The microchip has transformed life on earth in both the workplace and the home.  Since the transistor’s invention, processing power has increased at an exponential rate while cost has decreased in a spectacular fashion.  Between 1958 and 1980, the time required for a single computer operation fell by a factor of 80 million.  And in the late 1990s, the $199 Nintendo Game Boy offered more computing power than the fastest Cray supercomputers of the early 1970s.
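The scale of that improvement is easier to appreciate as an annual rate.  A quick back-of-the-envelope sketch (using only the figures quoted above—this is illustrative arithmetic, not a claim about any particular machine) backs out the implied yearly speedup and doubling time:

```python
import math

# The time for a single computer operation fell by a factor of
# 80 million between 1958 and 1980 (figures from the text above).
total_speedup = 80_000_000
years = 1980 - 1958  # 22 years

annual_factor = total_speedup ** (1 / years)            # speedup per year
doubling_time = math.log(2) / math.log(annual_factor)   # years to double

print(f"Implied improvement: ~{annual_factor:.2f}x per year")
print(f"Implied doubling time: ~{doubling_time * 12:.0f} months")
```

A sustained ~2.3× improvement every year—doubling roughly every ten months—is the kind of compounding that makes the Game Boy-versus-Cray comparison plausible.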

It is tempting to treat the computer as just another tool.  Ever since the human race graduated to walking on two feet we have used our hands to create tools.  The use of tools is part of what separates the human species from animals.  If there is anything new going on—astounding technological progress during the latter half of the 20th century indicates that there may well be—it has to do with the growth of information technology.  Information technology (IT), which includes computers and telecommunications, is unique in its ability not just to record, store and access information—books provide that functionality—but also to modify, organize, analyze, share and transmit data over vast distances with speed and accuracy.  Businesses have embraced computer-based IT, restructuring their operations and gaining substantial efficiencies in the process.  Prior to the arrival of information technology and computers, progress in the industrial age depended on acquiring scarce physical resources and applying manufacturing techniques to produce physical products.  But in the current information age, the network replaces the assembly line and physical constraints fade away.  Unlike physical resources mined in the ground, knowledge is not subject to the law of conservation—each generation can draw upon past insights and innovations.  And information technology allows people to leverage the power of ideas, information and knowledge to an unprecedented degree.  When information is centralized in space and time it has the potential to reveal fundamental truths—the only limit is human intellectual capacity.

While advances in technology have led to vast increases in economic productivity and shareholder wealth, it is important to note that the relationship between technological progress and shareholder wealth is far from transparent or uniform.  As a case in point, the Internet has provided consumers with more choice and lower prices while many businesses have experienced lower profitability as a result of stiff global price competition.  Recent history illustrates that many traditional, old-economy firms that do not produce technology have built tremendous shareholder wealth—firms such as Wal-Mart, Coca-Cola and Home Depot.  Yet these leading firms have demonstrated an ability to exploit technology within their operations.  Conversely, many technology companies have produced marginal profitability and dismal returns for their shareholders.  For instance, most of the dot-com firms went bust at the turn of the century.

This article discusses the repercussions of technological innovation on investment strategy.  I look at how firms that develop and market technologies shape up as investment candidates, exploring the pitfalls and unique risks related to investing in technology companies.  But I also discuss the broader implications of technology and the information age.  The article ventures into the field of behavioral finance to discuss human biases present in the investment decision-making process.  Innovation and disruptive technologies are studied, as are issues of scale and standardization.  Next, I focus on one of the best-performing sectors of technology—the software industry—looking at key factors such as network effects and switching costs.  Then I show how leading firms from a variety of industries—including traditional old-economy sectors—have used technology to their advantage.  Finally, I wrap up with a look at how technology has affected industry and societal trends.

A firm’s long-run stock market success is determined by its ability to achieve sustainable competitive advantages.  Whether a firm produces low- or high-technology products, profitability is influenced to a large degree by industry structure and the nature of the firm’s business model.  The greatest investment rewards will continue to come from industries featuring high barriers to entry and from individual businesses that present competitors with firm-specific competitive barriers.  Barriers include:
· High switching costs for customers
· Large minimum capital requirements, efficient manufacturing scales or steep economies of scale
· Brand identity and the reputation that comes from strong, differentiated brand names
· Enforceable patents
· Proprietary product differences, trade secrets and unique know-how
· Preferential access to distribution channels in major markets
· Exclusive control of key raw materials
· High regulatory costs for achieving product approval that act as a barrier to new entrants
· Exclusive or preferential market rights conferred by governments
· Unique human resources capable of superb execution, such as extraordinary management talent

Technology versus Pragmatism

German writer and philosopher Johann Wolfgang von Goethe used to keep rocks on his desk to remind himself to avoid abstractions.  Goethe believed that, in order to hold the reader’s attention, a good writer should focus on specific details instead of elaborate theories.  Investors seeking to add technology companies to their portfolios might want to take note of Goethe’s habit.  As in writing, failure to take care of practical details can doom even the most promising technologies.  History is littered with visionary ideas and elegant technologies that failed to live up to commercial expectations because they made no practical business sense.  In order to harness the potential of technology, business leaders must put technology to practical use—converting ideas and innovations into light bulbs, automobiles, airplanes and computers.  These modern conveniences serve a need and provide genuine value.  Market success requires more than a bunch of engineers solving technical problems—it is not enough to invent a great technology.  The technology must have profitable applications that solve practical problems facing a firm’s customers.  Bringing a technology to market successfully requires a constant focus on customer needs, costs and changing market dynamics.  Research and development is no substitute for taking into account consumer behavior, for creating value for the customer and for communicating those benefits to the customer.  While universities can pursue pure scientific research, businesses and serious investors have to focus on making money, and innovative technology is only one piece of the puzzle.

In addition to a viable business plan, firms must be able to execute the plan.  Ambitious and complex plans for new technologies often look appealing on paper but tend to fall apart in execution—as the saying goes, success is 95 percent follow-through.  Even companies blessed with superior business models must be able to execute.  Direct sellers Gateway and Micron Electronics had business models similar to Dell’s, yet Dell outperformed these peers through better execution.  During the heady days of the dot-com frenzy, entrepreneurs often neglected fundamental practical considerations in both their business models and their day-to-day operations.  There was a focus on capturing orders over the web while neglecting practical functions such as back-office fulfillment and logistics.  Regardless of the technology, the physical issue of getting a package from Point A to Point B remains—there is no benefit to allowing a customer to place an order over the web in four minutes if it takes four weeks for that customer to receive the order.  Many web retailers failed to foresee the labor costs required to serve the consumer segment.  In order to deal with customer inquiries by phone or e-mail, a large-scale e-commerce site requires a customer service call center entailing a substantial capital investment.

In the mid-to-late 1990s, the Internet sector experienced explosive revenue growth.  However, the vast majority of Internet firms had no earnings, no dividends and few tangible assets.  Revenue growth alone is insufficient if firms cannot demonstrate that those revenues can lead to profitability.  Eventually, investors ran out of patience with the dot-coms and the sector went for a colossal cordless bungee jump.  The lesson: as wondrous and as transforming as the digital revolution has been, it has not eliminated the need for a firm to earn profits; it has not repealed the business cycle; and it has not eliminated the tendency of stock markets to experience unsustainable speculative bubbles.


Iridium LLC was a global satellite telecommunication consortium founded by Motorola Inc.  Conceived by Motorola engineers in the Arizona desert in 1987, Iridium spent over $5 billion deploying a network of 66 mobile telephone satellites in space.  The system promised to provide mobile phone service “with anyone, anytime, virtually anywhere in the world.”  When the project commenced, Wall Street was bullish on Iridium and the mobile phone market appeared to show potential for a satellite phone service.  The ambitious undertaking required 12 years and 20 million lines of computer code to build.

Unfortunately, the project experienced cost overruns and by the time the system was up and running, market dynamics had changed—the rapid expansion of cheap land-based capacity had captured Iridium’s target market.  Iridium had targeted high-end consumers who traveled internationally but this segment was unwilling to pay $3,000 for bulky handsets and $7 per minute for telephone service.  Motorola did not foresee rapid growth in the land-based cellular telephone industry and how inexpensive service would become.  There was simply no viable mass market for Iridium’s service.  The only demand was from a small niche market of customers without access to conventional phone services such as companies working in remote locations (worse still—Osama bin Laden is reported to have purchased a satellite phone from a New York-based company in 1996).  With just 55,000 subscribers, the business was unable to pay the interest charges on its start-up costs.  Iridium defaulted on $1.5 billion in loans and filed for bankruptcy protection in 1999.  Motorola was forced to write off more than $2.5 billion.

The story of Iridium highlights the risks of investing in new technology.  Although labeled Iridiots by some, Motorola and its partners had strong technological skills and business acumen—the venture had the support of Wall Street at the outset.  The problem lies with the fundamental nature of large-scale technology projects.  There was no way to gauge the true cost of building such a complex system until it was already a fait accompli.  And there was no way to know in advance how the mobile market would develop by the time the system was fully operational.  The best the consortium could have done was to recognize the changing market and cost issues earlier on and shut down the project sooner.  In the end, when it comes to technology investing, there is a degree of unavoidable risk that goes with the territory.


If we compare a typical high-technology company with a consumer products firm such as Procter & Gamble, we will find that the consumer products company has steadier and more predictable returns.  Compared to the high-tech company, P&G’s business model is easier to understand.  The consumer products industry is also more uniform: its participants resemble one another far more closely than do participants in the high-tech industry.  These factors imply that cutting-edge technology firms are risky investment prospects—we often see unique business models that have not been proven to the degree that traditional industries have.

Lack of predictability can also be a problem with technology firms that have proven themselves in the past.  For a technology investor, success requires more than knowing that businesses such as Dell and Cisco Systems can generate astounding returns—there has to be a way to predict those returns.  There has to be a way to find a pattern to successful technology companies or else an investor is stuck looking into a rear view mirror when the traffic ahead is moving swiftly.  Unfortunately, it is next to impossible to predict the long-term economics of many technology companies.  Over a period of five to ten years, a technology company can see its fortunes change unexpectedly and dramatically and sometimes, as Iridium’s story shows, the outcome is not pretty.

There are a few key driving forces that affect business and society in profound ways.  In the 20th century, factors such as war, demographics, technology and globalization influenced world economies and societies.  Ever since the post-World War II baby boom of the 1950s, demographic trends have exerted powerful forces and businesses have targeted this massive population bulge for economic profit.  The baby boomer effect has also shown a degree of predictability in its pattern.  If demand is a function of the consuming population, then we can predict the size of a particular population segment at a particular moment in time.  By contrast, the pattern of technological change is more haphazard.  Futurist John Naisbitt said it best when he wrote:  “Technological innovation rarely goes the way of straight-line extrapolation, but proceeds as part of a lurching dynamic of complicated patterns and processes.”  Without the element of predictability, technology bets become highly speculative.

Legendary investor Warren Buffett wrote the following passage in a 1994 letter to shareholders of Berkshire Hathaway:

“Investors should remember that their scorecard is not computed using Olympic-diving methods: Degree-of-difficulty doesn’t count. If you are right about a business whose value is largely dependent on a single key factor that is both easy to understand and enduring, the payoff is the same as if you had correctly analyzed an investment alternative characterized by many constantly shifting and complex variables.”

If you can take away a single concept from this article, I hope that you contemplate the above pearl of wisdom and apply it to your investment approach.  The message is particularly salient given certain shortcomings that humans tend to exhibit in decision-making.


Behavioral Finance is the study of how people interpret information when making investment decisions.  Studies from the field of Behavioral Finance show that humans are prone to certain cognitive illusions, biases in judgment and emotional weaknesses in their investment decision-making.   

With the exception of meteorologists and racetrack handicappers, researchers have found that most professionals demonstrate overconfidence in their abilities.  Furthermore, most people are prone to excessive optimism.  For example, over 80 percent of drivers believe they are above average in their driving ability even though the true ratio must be around 50 percent.  Overconfidence and excessive optimism combine to produce a tendency to overestimate one’s knowledge, underestimate risk and exaggerate one’s ability to control events.  Findings indicate that investors are consistently overconfident in their ability to beat the stock market, interpret luck as skill and often see patterns and causal factors at work where there are none.  Indeed, achieving better-than-average returns in the stock market is a zero-sum game—for every winner there must be a loser, so everyone cannot be right.

Humans tend to reason intuitively using simple decision rules.  This tendency can lead to oversimplifications when dealing with complex dynamics such as those present in technology industries.  In addition, investors may underweight the risk inherent in speculative technology stocks because humans tend to overweight the likelihood of low-probability events.  Lotteries take advantage of this human trait.

People habitually give too much weight to recent experience.  We also extrapolate recent trends that are at odds with long-run averages and statistical odds.  For instance, mutual fund companies often tout the performance of their top-performing funds in order to attract new investors even though, if you take the top 10 percent of mutual funds in a given year, four out of every five of those funds fail to remain in the top 10 percent the following year.  The past is not always prologue to the future.  Over the short term, historical extrapolation can prove accurate, but as the time horizon lengthens, predictions become far more dangerous and can miss critical turning points—economists consistently fail to predict reversals in economic growth or decline.  It is precisely this type of linear thinking that magicians take advantage of when performing magic tricks.  As a result of these tendencies, investors in volatile or cyclical technology companies are particularly susceptible to buying and selling at the most inopportune times.  Businesses give too much weight to recent experience when they put too much emphasis on short-term performance measures—the numbers.  Numbers are concrete and measurable, but in a dynamic business world marked by innovation and change, numbers can provide a false sense of security.  In the late 1970s and early 1980s, U.S. automakers were focused on the current numbers—the profitability of large cars—instead of envisioning future demand for smaller, more fuel-efficient vehicles.
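The mutual fund statistic above is only modestly better than what pure luck would produce.  A small Monte Carlo sketch (the return figures below are invented for illustration, not real fund data) shows that if fund performance were independent from year to year, only about one top-decile fund in ten would repeat—against which the cited one-in-five persistence looks far less impressive:

```python
import random

# If fund returns were pure luck (independent year to year), a fund's
# year-1 ranking says nothing about year 2, so a top-decile fund should
# repeat only ~10% of the time. Parameters are illustrative assumptions.
random.seed(42)
n_funds, n_trials = 1000, 200
top_n = n_funds // 10
repeats = 0.0

for _ in range(n_trials):
    year1 = [random.gauss(0.08, 0.15) for _ in range(n_funds)]  # assumed 8% mean, 15% vol
    year2 = [random.gauss(0.08, 0.15) for _ in range(n_funds)]
    top1 = set(sorted(range(n_funds), key=lambda i: year1[i])[-top_n:])
    top2 = set(sorted(range(n_funds), key=lambda i: year2[i])[-top_n:])
    repeats += len(top1 & top2) / top_n  # fraction of year-1 stars repeating

print(f"Chance-only repeat rate: {repeats / n_trials:.1%}")
```

In other words, regression to the mean alone explains most of the turnover at the top of the fund rankings.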

Investors can counter their human fallibilities by first, being aware of them and second, striving to adopt opposite tendencies such as humility, caution and critical-thinking.  Regardless of one’s intuition, investors must take pains to minimize risk and conduct detailed fundamental analyses of prospective investments—due diligence is paramount.

Innovation and the First-Mover Advantage

An old management adage says: “You can always tell who the pioneers are.  They’re the ones with the arrows in their backs.”  The fact is, innovation and market leadership do not always go hand in hand. 
Products lacking patent protection can be reverse-engineered and copied or imitators can refine and make improvements on the original invention.  Imitators can avoid high research and development costs as well as the high marketing costs involved with getting a unique product accepted in the marketplace.  Tycoons from the robber-baron era such as Andrew Carnegie (steel) and John D. Rockefeller (oil) downplayed the importance of innovation in business.  Carnegie once observed, “Pioneering don’t pay.”  Rockefeller preferred to create wealth through the organization and deployment of power.  If a technology turned out to offer economic advantages, Rockefeller could exert his influence and purchase the company outright.  Nevertheless, some of the biggest growth stories of the last half-century have relied on technological innovations.  Polaroid Corporation developed new camera and film technology using polarized light.  The company introduced the instant-picture camera in 1947 and its stock rose 8,366 percent during the 1950s.  Another breakthrough was the xerographic copier, or photocopier, by Xerox Corporation.  For most of the 1960s it was protected from competition by patents and enjoyed a virtual monopoly in the photocopier market.  Xerox’s stock rose 5,146 percent in the 1960s.


One reason that pioneers often fail to make the most of their inventions is because they do not possess the requisite complementary business skills.  Beyond mere technological skills, commercial success requires functional capabilities such as product design, marketing, manufacturing, distribution, sales, service and support.  Imitators can capture the majority of an innovator’s market if followers have the crucial complementary assets, which an innovator may lack.

As an example, RCA developed the color television in the 1950s but, by the late 1960s RCA was unable to fend off Japanese rivals Matsushita Electric Industrial Company and Sony Corporation, which were able to manufacture high-quality color televisions in large volumes.  Drawing on their manufacturing prowess, Japanese firms came to dominate the global consumer electronics industry by the late 1980s.  Distribution can also play a key role in determining market acceptance because if a pioneering company cannot get shelf space for its invention then the lack of distribution will provide an opening for competitors.  Healthcare industries provide a good example of the importance of properly marketing an innovation.  Breakthrough drugs require specialized marketing through a focused sales force.  Sales representatives contact doctors, who select medicines by virtue of their ability to write prescriptions.  Drug companies must devote substantial resources to educating and informing medical practitioners about new pharmaceutical products.

British firm EMI (Electric and Musical Industries) pioneered a well-known innovation from the healthcare industry—the CAT scanner.  EMI was better known as the recording label of The Beatles during the 1960s, a fact that has led some radiologists to believe The Beatles’ creative work included developing the scanner—but I digress.  In the late 1960s an EMI engineer named Godfrey Hounsfield developed the CAT scanner while engaged in pattern recognition research.  He was able to produce cross-sectional images of the human body by employing a technology called computerized axial tomography.  It was the greatest advance in radiology since the discovery of X-rays in 1895.

EMI introduced the first CAT scanner into the U.S. market in 1973.  Although EMI had no prior experience in the medical equipment field, the firm decided to build its own manufacturing facilities and handle sales and distribution itself.  As a sophisticated diagnostic technology, the scanner required high levels of training, support and service.  Unfortunately for EMI, the company had no experience with these key complementary requirements, nor did it fully recognize their importance.  Also problematic was the lack of patent protection afforded the scanner invention.  As a result, late entrants such as General Electric and Ohio-Nuclear were able to make inroads into the market on the basis of their relatively superior skills in marketing, sales, service and support.  Late entrants to the scanner market also did a good job of advancing scanner technology through their own pioneering innovations.  While EMI’s CAT scanner met with initial success, by the late 1970s the medical electronics division of EMI was experiencing significant losses, and by 1980 EMI was no longer the U.S. market leader.  Thorn Electrical Industries acquired EMI in 1979, and shortly thereafter Thorn sold the CAT scanner business to General Electric.  The EMI case demonstrates that pioneers without the resources and complementary skills to exploit innovations are vulnerable to competitive threats.  A first-mover has an early advantage in its market, but it takes more than great technology to realize ultimate market success.


Clayton Christensen’s groundbreaking 1997 book, The Innovator's Dilemma: When New Technologies Cause Great Firms to Fail, brought a fresh perspective to the topic of technology and innovation.  Christensen is a Harvard Business School professor who studied innovation in various industries including the hard disk drive sector and the mechanical excavator market.  His findings indicate that certain disruptive technologies can destroy even the greatest of companies.  While they begin as apparently benign innovations, disruptive technologies can eventually transform industries and inexorably alter the competitive landscape.  The Innovator's Dilemma is a warning siren of sorts—it serves to warn investors of the inherent dangers of investing in technology companies.

As Christensen describes it, a pioneering company develops a new technology that cannot yet address mainstream market needs but finds a use in small fringe markets.  However, technological progress leads to a phenomenon whereby product performance improves at a faster rate than customers’ needs grow.  As a result, current mainstream, or sustaining, technologies eventually overshoot demand requirements while pioneering technologies eventually mature enough to match market needs.  In the hard disk drive industry, 3.5-inch drives were initially relegated to laptops, but eventually the larger 5.25-inch drives (the reigning format) overshot the market and 3.5-inch technology improved enough to meet market needs and become the standard.
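The overshoot-and-catch-up dynamic can be sketched numerically.  In this toy model (all growth rates are invented for illustration and are not drawn from Christensen’s data), customer needs grow slowly while both technologies improve faster, so the sustaining technology soon exceeds what customers can absorb and the disruptive one eventually becomes good enough for the mainstream:

```python
# Toy model of Christensen's performance trajectories (assumed numbers):
# demand grows slowly; the incumbent's sustaining technology and the
# newcomer's disruptive technology both improve faster than demand.
demand, sustaining, disruptive = 100.0, 100.0, 30.0
overshoot_year = catchup_year = None

for year in range(1, 31):
    demand *= 1.05       # customers' needs grow 5% a year
    sustaining *= 1.12   # incumbent improves 12% a year
    disruptive *= 1.25   # newcomer improves 25% a year
    if overshoot_year is None and sustaining > demand * 1.3:
        overshoot_year = year   # incumbent now far exceeds what users need
    if catchup_year is None and disruptive >= demand:
        catchup_year = year     # newcomer is now "good enough" for mainstream

print(f"Sustaining technology overshoots demand by year {overshoot_year}")
print(f"Disruptive technology meets mainstream needs by year {catchup_year}")
```

The exact crossover dates depend entirely on the assumed rates; the point of the sketch is the shape of the race—a faster-improving entrant starting from far behind still overtakes a slowly growing demand line within a business-relevant horizon.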

Disruptive technologies are characterized as unique and revolutionary in nature—the personal computer, for instance, shared these traits.  They may first appear at the low end of the market but, by the time they become disruptive, they can displace leading high-end products.  Disruptive innovations are often cheaper and present users with unique benefits when compared to conventional mainstream technologies.  Breakthroughs leading to disruptive technologies often require a large capital investment and many years of research and development—it took two decades of R&D before Japanese developers were able to market the video cassette recorder (VCR).  But successful innovators are rewarded with an enviable competitive position because, by the time a disruptive technology has reached mainstream markets the innovator has built an organization of sufficient scale and focus that their lead can be insurmountable.  As a consequence, firms cannot afford to wait until disruptive technologies have matured into mainstream competitive threats before orchestrating a competitive response.

In their early development, disruptive technologies do not appeal to most established firms because the innovations are relegated to small and specialized niche markets.  Since the ultimate applications of any innovation are difficult to accurately predict, pioneering technologies entail a degree of unavoidable risk that large enterprises typically shun.  Large established firms are focused on showing immediate results and small markets cannot fulfill the growth needs of these large companies.  In addition, large established players are often public companies that face short-term performance pressures from the financial community.  This pressure makes it difficult to sacrifice current performance for riskier and more distant prospects.  Furthermore, market leaders have a natural tendency to dwell on old models that have been profitable in the past.  While an established leader possesses a cost structure that works in high-end, mainstream markets, the same cost structure will frequently fail to achieve profitability in emerging low-end markets.  When established firms have invested the bulk of their organizational effort into mainstream markets, pioneering innovations do not mesh with the firm’s resources, values and processes.  These leaders are focused on sustaining the performance of current technologies in an incremental and evolutionary manner instead of exploring the opportunities for revolutionary innovations.

But in failing to see the possibilities of future technologies, current leaders are sowing the seeds of their eventual destruction—they ignore the acorn that one day becomes a giant oak.  History has proven that a fundamentally novel technological architecture can make mainstream technologies obsolete and destroy the usefulness of current technological competencies.  Current industry leaders may focus on competition from another product in the same category when the real threat is from disruptive innovations that render the boundaries defining the category obsolete.  The Internet is an example of a disruptive technology that redefined business categories.  For market leaders, focusing on current customer needs has led to past success but, in their early manifestations, disruptive technologies do not meet current customer needs.  As a result, companies risk becoming captive to the needs of their customers while competitors slowly build expertise in future technologies—the ironic implication is that listening to customers can lead to failure.  The term innovator’s dilemma refers to the fact that disruptive technologies appear unappealing in the early years even though early investment is critical for long-term success.

For investors, the implications of The Innovator’s Dilemma are troubling.  Even the best-managed firms, the ones that invest in new technologies, listen to their customers and keep their ears to the ground, are susceptible to disruptive technologies.  Therefore, if disruptive technology may eventually destroy a company’s market, trying to value the firm by discounting future cash flows becomes an exercise in futility.  While all industries are affected by disruptive technology, high-technology industries are more exposed to this type of risk.  For instance, disk drive technology experiences more frequent and more profound disruptions than internal combustion engine technology.  The investment implications are these: minimize your exposure to technology industries and be aware of the tremendous degree of risk involved in high-technology investments.

Scale and Standardization in Technology

Issues of scale and standardization are key to understanding how technology companies achieve enduring competitive advantages.


Although some great inventions are made by engineers tucked away in garages, as technology industries have matured, size and manufacturing scale have become important advantages.  Just to get a high-technology product to market, it can take a huge capital investment—Japanese companies poured around $10 billion into developing active-matrix flat-panel displays for laptop computers.  In the 1960s, IBM dominated the computer industry with its line of System/360 computers.  But in order to develop the System/360, IBM had to invest $5 billion and hire tens of thousands of employees.  The huge bet paid off handsomely for Big Blue as data-processing revenues grew from $1.2 billion in 1963 to $8.7 billion by 1973.  As these examples show, it takes tremendous resources to develop high-technology products.  The huge scale required for developing technology products acts as an entry barrier to keep small competitors out of the industry.  Few businesses can afford to invest in large, complex technology projects with a payoff that is five or ten years down the road.

In recent times, Intel has emerged as a technology company of formidable scale and resources.  In the year 2002, a new Intel fabrication facility for silicon wafers cost over $2 billion.  Massive engineering and manufacturing resources allow Intel to fend off competition by producing chips more efficiently and at a lower unit cost than its rivals.  Intel manufactures chips on such a scale that it has helped drive the cost of computers down so that they are affordable for the mass market.  As for the competition, Intel’s strongest competitor, Advanced Micro Devices (AMD), has been unable to generate profits on a consistent basis.  AMD lacks the financial resources to compete with Intel on an equal footing and must engage partners to share the cost burden of building silicon wafer facilities.  Other Intel competitors such as Cyrix, National Semiconductor, Centaur and Rise Technology have either merged or gone bankrupt.

Software companies such as Microsoft incur massive costs to develop operating systems and applications.  Fixed costs for developing an operating system such as Windows are so high that only the largest companies are capable of meeting the challenge.  However, once the software product is developed it can become remarkably profitable because the marginal costs of production are very low—Microsoft has reaped a windfall licensing Windows.
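The economics described here, a huge fixed development cost spread across many copies with near-zero marginal cost, can be sketched with a little arithmetic.  The dollar figures below are hypothetical, chosen only to show how the average cost per copy collapses as volume grows:

```python
# Average cost per copy of a software product: the fixed development
# cost is spread over the number of copies sold, plus a small marginal
# cost for each additional copy.  All figures are hypothetical.
def average_unit_cost(fixed_cost, marginal_cost, units):
    return fixed_cost / units + marginal_cost

FIXED = 1_000_000_000   # hypothetical cost of developing an operating system
MARGINAL = 1            # hypothetical cost of producing one more copy

for units in [100_000, 10_000_000, 100_000_000]:
    print(units, average_unit_cost(FIXED, MARGINAL, units))
```

At a hundred thousand copies each one must recoup about $10,000 of development cost; at a hundred million copies, only about $10.  This is why scale itself becomes the profit engine once the fixed cost is sunk.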


Successful technology investing requires an understanding of the importance of industry standardization.  Although it is very difficult to predict which design will eventually become the dominant industry standard, investors must recognize the powerful influence that standards have over a firm’s fortunes.  For instance, in the early automobile industry, producers of steam cars eventually went bankrupt after the internal combustion engine became the dominant design for automobiles.  Nokia Corporation emerged as a leader of the cellular telephone market as a result of its strategy of supporting a unified European technology standard known as GSM (Global System for Mobile Communications).  By contrast, American companies sponsored a handful of incompatible standards.  Japanese firm Matsushita pioneered the VHS videocassette recorder in the 1970s.  Matsushita promoted the VHS standard by serving as an original equipment manufacturer (OEM), supplying other firms that could brand and distribute VCRs under their own names.  As a result of Matsushita’s strategy, VHS displaced Sony’s Beta and RCA’s videodisc formats to become the dominant industry standard.

In the personal computer industry, Microsoft and Intel are good examples of how leadership is a result of controlling industry standards.  Microsoft makes the dominant operating system for desktop PCs as well as the leading business application software programs.  Intel pioneered the x86 architecture, which is the dominant PC microprocessor design.  Competitors in the microchip industry have found it difficult to compete with Intel’s proprietary architecture.  Intel rarely licenses its x86 architecture and most companies that have tried to develop chips based on it have failed.  Even when dominant firms control industry standards, technology has a tendency to develop in cycles or generations, so a leader must remain vigilant if it is to retain leadership.  Despite their enviable positions, both Microsoft and Intel have defended their leadership by continuing to identify and exploit technological trends.  They remain moving targets for their competitors—always innovating, always improving.  Microsoft progressed from the MS-DOS operating system to Windows.  Intel introduced the 286 microprocessor in 1982, the 386 in 1985, the 486 in 1989, the Pentium in 1993 and the Pentium II in 1997.

Standardization relates to scale in an important way because scale and market leadership help firms to advance industry standards.

A Bright Spot: The Software Industry

If you have read this far you may be wondering whether there is any legitimate hope of making money in technology industries.  While most technology firms fail to live up to expectations, there are a few leading tech companies that have succeeded and have generated enormous returns for their shareholders.  Some of the best performers have been software companies.  Firms such as software giant Microsoft, database maker Oracle and German software company SAP have demonstrated that it is possible to profit from investments in leading software companies.  (SAP makes enterprise resource planning [ERP] software that integrates functions such as finance, production and human resources into massive information systems at large organizations.)

Before software firms like Microsoft and Oracle emerged as industry behemoths, IBM had developed the personal computer industry’s dominant architecture and was the undisputed leader.  However, in the decades since the PC’s invention, software companies have earned a large portion of the industry’s profits.  While IBM’s PC business has been marginally profitable—most of the profits generated from sales of PC hardware have accrued to companies such as Intel and Dell—Microsoft and leading application software companies have generated tremendous results.  As of late 2002, Microsoft alone had a market value more than twice as large as IBM’s.

The software industry exhibits a high degree of concentration—it is dominated by a small number of market leaders with relatively high market shares.  For instance, SAP dominates its market and was able to claim, by the mid-1990s, that it had higher research and development spending than its competitors had revenues.  The high degree of concentration can be explained by two powerful phenomena—network effects and switching costs.


Robert Metcalfe founded the networking company 3Com and invented the Ethernet protocol for sharing information through computer networks.  Metcalfe is also known for Metcalfe’s Law, which states that the value of a network is proportional to the square of the number of its users.  In practical terms, the law implies that as the number of computers in a network increases, the number of possible cross-connections grows quadratically: the usefulness of the network increases far faster than the number of users.  Metcalfe’s Law helps explain how the Internet was able to grow as rapidly as it did during the 1990s.  As the Internet community expanded, the network became far more valuable as businesses and consumers linked into the system both accessing and sharing resources.  When your family, friends and co-workers have e-mail addresses there is a strong incentive to join the network.  As the largest competitor in the online auction field, eBay enjoys network effects because buyers want the most choice and sellers want the most buyers—when network effects are present the largest network wins out.
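Metcalfe’s observation is easy to verify with a little counting: a network of n users contains n(n − 1)/2 possible pairwise connections, so the connection count grows roughly with the square of the user count.  A minimal sketch:

```python
# Possible pairwise connections in a network of n users: n * (n - 1) / 2.
# The count grows roughly with the square of n, which is the intuition
# behind Metcalfe's Law.
def possible_connections(n):
    return n * (n - 1) // 2

for users in [10, 100, 1000]:
    print(users, possible_connections(users))  # 45, 4950, 499500
```

A tenfold increase in users yields roughly a hundredfold increase in possible connections, which is why a large network pulls so far ahead of a slightly smaller rival.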

But network effects are not limited to computer networks and auctions.  Network effects occur in industries where one company provides an open architecture that other companies support and integrate into their own products.  When vendors along the supply chain standardize on a specific platform, it perpetuates this dominant architecture because all participants have a stake in promoting the chosen standard.  If the open architecture is also proprietary to the company that creates it, there is an opportunity to achieve sustainable competitive advantage.  For example, Intel’s x86 microprocessor architecture is proprietary and Intel can control its development and microchip pricing.

The software industry experiences strong network effects because buyers will choose a common platform that has established itself as a standard.  Software customers deploy their information systems in a networked environment and they tend to adopt standards in order to minimize compatibility problems and maximize portability.  As a result, they tend to be drawn toward products with a strong network of supporting infrastructure and many existing users.  The Microsoft Windows operating system has a market share over 90 percent in the desktop PC market and benefits from strong network effects.  Corporate users are attracted to the Windows standard because employees are familiar with the Windows operating environment and compatible applications.  A standard operating system also allows diverse organizations to share information through compatible file formats.

Because the installed base of Windows users is so large, independent software vendors choose to develop applications for Windows instead of competing platforms.  Consequently, there are a large number of compatible applications available to customers who choose the Windows operating system—Windows supports over 70,000 applications.  More applications are written for Windows than for any other operating system including the Apple Macintosh and Linux systems.  In 1994, IBM attempted to compete with Windows by releasing the OS/2 Warp operating system.  Despite a massive investment by IBM, OS/2 Warp was a flop.  The system failed to achieve significant market share because it was unable to gain widespread support from independent software vendors.  Microsoft had codified its intellectual property into software libraries that served as component tools for the development community.  When creating application software, third-party developers could access sophisticated Windows components leveraging knowledge and functionality built up over the years.  Replacing such a repository was a gargantuan and complex challenge—a challenge IBM was unable to meet.  Furthermore, developers had no incentive to write applications for the new OS/2 Warp platform, which had a tiny installed base, when there was a massive entrenched base of Windows users.  The breadth of applications supporting Windows fueled customer demand for Windows and as the installed base grew, independent developers wrote even more applications for Windows thus creating a self-reinforcing cycle known as a positive feedback loop.

In addition, customers coalesce around leading software products because they perceive them as less risky.  Implementing corporate information systems is an expensive and risky undertaking—it is estimated that American business and government organizations spent $81 billion for cancelled software projects in 1995.  Even implementations that succeed can experience missed deadlines and severe cost overruns.  Therefore, it is in a customer’s best interest to reduce risks by choosing dominant technologies that have stood the test of time and have proven their effectiveness and reliability.

When a business or consumer makes a software investment they also want to know that their product will be supported over the long term.  In particular, customized software applications tend to hang around for a long time—even decades.  The Year 2000 problem is a good example of how organizations can underestimate the shelf life of their technology systems.  For decades, software developers conserved memory by storing only the last two digits of the year.  At the time, they were aware that a new century would cause problems but they never expected their systems to last that far into the future.  The point is that, in order to guarantee future support for their planned software systems, customers choose standard, market-leading products with a strong network of support.  Businesses choose software from companies such as Microsoft, IBM, Oracle and SAP, in part, because they can be assured that in the distant future these leading companies will likely still be around to support their software.
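The two-digit-year shortcut behind the Year 2000 problem can be made concrete with a small sketch.  One common remediation was "windowing": choosing a pivot year and expanding two-digit years on either side of it.  The pivot of 50 below is an illustrative assumption, not a universal rule; real systems picked pivots suited to their own data:

```python
# "Windowing" expansion of an ambiguous two-digit year: values below
# the pivot are read as 20xx, values at or above it as 19xx.
# The pivot (50) is an illustrative choice for this sketch.
def expand_year(two_digit_year, pivot=50):
    if two_digit_year < pivot:
        return 2000 + two_digit_year
    return 1900 + two_digit_year

print(expand_year(2))   # 2002
print(expand_year(99))  # 1999
```

The ambiguity itself is the point: without a convention like this, "02" could mean 1902 or 2002, which is exactly the trap that decades-old systems fell into.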

If dominant software companies fare magnificently, then there must be a flip side to the equation for companies with less dominant positions.  In fact, laggards in the software industry have a tough row to hoe.  A study by the management-consulting firm McKinsey & Company showed that only 13 percent of struggling software companies were able to fully revive their operations.  In other words, roughly seven out of eight software turnarounds fail.  The same factors that reinforce one another in successful software companies also work to prevent struggling firms from regaining their ground.  Network effects on lagging software products are disproportionately weak compared to effects on dominant products.  Furthermore, once competitors gain market acceptance, switching costs make it difficult for a firm to win back lost business.  Customers tend to head for the exits at the first sign of trouble because they fear relying on a vendor that may not be around in the future to provide upgrades and support.  Employee retention can also be a problem at declining software companies.  Many software firms rely on stock options to attract and retain talented personnel.  But when a troubled firm’s stock price declines, employee stock options lose their value and a large part of compensation gets reduced.  As a result, a failing software company can experience severe labor losses, including the key employees who are a software company’s most valuable assets.


Once customers adopt specific software systems, they are faced with switching costs if they want to consider changing products.  A company that has made substantial investments training employees to learn one software system will have to retrain its workforce if it decides to switch systems.  And there are costs associated with integrating the new system with other existing company systems.  Training and integration costs such as these are both expensive and time consuming.  In addition to direct switching costs, when firms switch products they lose the value of their investment in the existing system.  All the value from the money sunk into training, implementation, integration and testing gets scrapped.  Therefore, switching costs provide an entry barrier for potential competitors while perpetuating the leadership of dominant firms.  Switching costs also reduce marketing expenses for entrenched products because existing customers are less likely to consider competing products.

In the case of Microsoft Windows, switching costs would include the expense, time and effort required to learn how to use the new operating system, the cost of acquiring a new set of compatible applications and the task of replacing files and documents associated with the previous applications.  Needless to say, most customers in the desktop PC market are unwilling to absorb the huge switching costs that a change from Windows would entail.

Switching costs and network effects combine to give first-movers a distinct advantage in the software industry.  Once a leading standard emerges, it will benefit from increasing advantages while laggards will fall further behind.  It is a winner-takes-all phenomenon where the leaders that emerge, such as Microsoft, can come to dominate the industry.


As a general rule, investing in industry leaders is good advice because a disproportionately large share of industry profits accrues to the best firms.  In software industry sectors however, leadership is crucial because of the influence of network effects and switching costs.  When network effects are strong and switching costs high, the dominant player can reap tremendous benefits—since there are few viable alternatives to Windows, Microsoft can set the price of its Windows software licenses well above its costs or the price that would be charged in a competitive market.

Besides the Windows operating system, Microsoft makes the Office suite of business applications.  The software suite is a classic example of a product with sustainable competitive advantages.  Microsoft Office has a market share above 90 percent and benefits from strong network effects and high switching costs—businesses have years of institutional knowledge locked into Office’s proprietary file formats.  In the 1990s, Microsoft expanded into new markets such as the Internet with MSN, home entertainment through the Xbox video game console and mobile device software for personal digital assistants.  But in a typical quarterly report released in 2002, Microsoft’s earnings were heavily dependent on the two divisions containing Windows and Office.  In fact, the remaining four divisions collectively lost hundreds of millions of dollars.  In short, Microsoft’s profitability depends on dominant products that enjoy competitive advantages based on network effects and switching costs.

When Microsoft had its initial public offering in 1986, it had a market capitalization of $525 million.  As of November 2002, Microsoft was valued at $311 billion and was the most valuable company on the face of the earth.  While it has never appeared to be a cheap stock based on traditional valuation criteria such as price-to-earnings ratios or price-to-book value, history has shown that the stock market has consistently underestimated Microsoft’s prospects.  Traditional economic models indicate that as markets mature, profitability should decrease.  But firms benefiting from network effects can continue to increase their market shares, profit margins and earnings even as their markets mature.  Investors tend to consistently underestimate the value of dominant software firms such as Microsoft because they apply traditional valuation criteria to companies in anomalous situations.  Dominant firms enjoying sustainable competitive advantages warrant premium valuations well above the market average.  While few firms possess truly sustainable advantages, the ones that do can generate outstanding results.

Bricks and Mortar and Technology

As founder and chairman of the Virgin Group of Companies, Sir Richard Branson built Virgin into a global corporate empire encompassing over 200 companies with worldwide revenues over £3 billion as of 1999.  In his autobiography he recounted an anecdote that made him aware of the importance of using technology in everyday life.  In Virgin’s early years the company was focused on the music business, so when Branson saw his children spending less time listening to music and more time playing video games, he took notice.  Branson eventually decided that interactive technology had a bright future and purchased the foreign rights for Sega video games.  And as the market for video games developed Branson made a substantial profit on his Sega investment.

The best companies incorporate technology into their everyday business, using it to coordinate the flow of information and eliminate manual activities.  These top-notch firms are keenly aware of technology’s potential to create opportunities, streamline business processes and reduce costs.  For example, Wal-Mart applies technology to reduce distribution costs and maintain first-rate inventory systems.  In addition, companies such as Home Depot, Citigroup, Federal Express, Dell, Cisco Systems and SAP have demonstrated expertise at exploiting technology.

Home Depot invests hundreds of millions of dollars in technology every year.  For example, the firm uses wireless pen-based PCs to track inventory and place orders from suppliers.  Product demand is forecast based on a database containing the sales history broken into half-hour increments.  And when goods need to be ordered these mobile units can send purchase orders directly to the manufacturer.  Home Depot claims the ordering technology has eliminated one full-time position per store and saved the chain $22 million a year.

Federal Express ships over 3 million packages a day and relies on technology to track the progress of its shipments.  When a FedEx package changes hands, employees enter a tracking number into portable devices called SuperTrackers.  Data then gets routed to the company’s mainframe computer system where it is available to customer service representatives who can pinpoint the location of a package on a delivery route.

Networking company Cisco Systems owes a significant part of its success to its integrated information technology infrastructure.  In 1994 Cisco completely replaced its back-office information system, which was failing to serve the company as it continued to grow.  After successfully implementing the new ERP system, Cisco spent the following years building off of this foundation by standardizing data and processes throughout the firm.  Cisco streamlined its supply chain and connected electronically with customers resulting in lower inventory levels and higher inventory turnover.  In addition, by 1999, nearly 80 percent of Cisco’s sales came from the Internet.

Dell Computer Corporation is perhaps the best example of a firm that embraces technology and achieves competitive advantages through its successful implementation.  The company possesses an IT architecture that is integrated with all of its key corporate functions.  Dell also uses the Internet to reach millions of potential customers at a low marginal cost.

Information technology plays a fundamental role in Dell’s build-to-order production process.  At Dell, business tasks such as order taking, procurement, logistics, production, order fulfillment, service and support all rely on the use of IT for proper co-ordination.  Dell is sometimes referred to as a virtual company because it outsources many of its non-strategic functions.  For instance, computer components are outsourced and some Dell notebook computers are entirely outsourced.  In general, Dell’s key strategic activities include assembling components into computers and managing the customer relationship.  The entire network of suppliers, partners and customers is integrated and managed through information technology and real-time information sharing.  Vendors in the supply chain have real-time access to Dell’s customer order information and technical specifications through integrated IT systems.  This information sharing allows suppliers to monitor orders and respond appropriately to changes in demand—they can quickly build and ship inventory to meet demand instead of waiting to receive a purchase order.  Dell’s integrated technology systems allow the firm to substitute information for inventory—it has the lowest inventory levels in the industry.  Dell’s ability to minimize inventory is a big advantage because of the nature of the computer industry.  Declining component prices mean that excess inventory loses value at an estimated 10 percent a month.  Furthermore, the newest technologies have the highest margins because of constant technological advances and performance improvements.  Therefore, Dell’s lean inventory and ability to respond to demand quickly give the company a formidable competitive advantage.
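The 10 percent monthly depreciation estimate cited above compounds quickly, which a short calculation makes vivid.  The dollar amount below is hypothetical; only the monthly rate comes from the text:

```python
# If excess inventory loses roughly 10 percent of its value each month,
# the remaining value after k months is initial * (1 - 0.10) ** k.
def remaining_value(initial, months, monthly_loss=0.10):
    return initial * (1 - monthly_loss) ** months

# A hypothetical $1,000 batch of components after one and six idle months:
print(round(remaining_value(1000, 1), 2))  # 900.0
print(round(remaining_value(1000, 6), 2))  # 531.44
```

Nearly half the value is gone within six months, which is why Dell’s ability to substitute real-time information for stockpiled inventory translates directly into margin.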

If we study how the top companies use technology we find that they refrain from hoarding information in the executive ranks.  Instead, they share information with employees, partners and suppliers.  The Dells and Wal-Marts of the corporate world use technology to empower their organizations at every level.  Sam Walton, Wal-Mart’s founder, said it best:

“Communicate everything you possibly can to your partners.  The more they know, the more they’ll understand.  The more they understand, the more they’ll care.  Once they care, there’s no stopping them.  If you don’t trust your associates to know what’s going on, they’ll know you don’t really consider them partners.  Information is power, and the gain you get from empowering your associates more than offsets the risk of informing your competitors.”

These examples show that many leading companies in both traditional and technology industries are adept at using and exploiting technology for competitive advantage.  In summary, when you consider the role of technology in business and investing, you may want to shift the focus to companies that do not necessarily produce technology, but that first and foremost are great users of technology.


A discussion of technology and investing would be incomplete without an examination of the Internet’s impact on the business world.  As a technology, the Internet was made possible by complementary advances in telecommunications and computing technology.  But it was the end of the Cold War that marked the beginning of the Internet as we know it.  Shortly afterward, in 1990, the U.S. Department of Defense disbanded the ARPANET, which was a precursor to the current Internet.  The ARPANET was replaced by a network—managed by the National Science Foundation (NSF) in the United States—called the NSFnet backbone.  In 1991, the NSF removed restrictions on private access to this Internet backbone and by 1994 the Internet was opened for widespread commercial use.

The biggest winners from the Internet’s explosive growth have been consumers and certain businesses—including some old-economy firms—that have re-tooled their operations to incorporate the Internet.  For example, before the emergence of the Internet, the process of syndicating a large corporate bank loan among many banks was a tedious and time-consuming process marked by the distribution of hundred-page documents and expensive courier charges.  Nowadays, banks use the Internet to arrange syndicates.  Banks can access a Web site to download the offering document, email questions and co-ordinate the entire process.  It has been estimated that the new Internet system reduces the amount of time it takes to close a syndication deal by 25 percent.  Another example of increased efficiency comes from consulting company Booz Allen Hamilton, which has used the Internet to bill clients in the U.S. federal government.  In the process, the firm has eliminated paper handling on its monthly billings saving $150,000 a year and reduced the time it takes to get paid by 30 percent.  Business-to-business electronic commerce uses the Internet as a low-cost communications network to link with partners and suppliers worldwide.  Firms can find the lowest cost supplier on the Internet, integrate their supply chain and automate tasks such as invoicing, purchasing and inventory control.

However, many industry sectors have seen profitability suffer because consumers have used the Internet as a powerful comparison-shopping tool.  Internet users can rapidly retrieve product information and prices and then quickly compare multiple online sellers and select the lowest-priced offering.  The Internet puts power in the hands of users—it empowers them and leads to lower prices and more efficient markets—in fact, the Internet has had a deflationary impact on the economy.  For businesses, the effect on the bottom line has often been negative.  Transparent pricing on the Internet has led to global competitive pressures and declining profit margins in certain industries.  The list of losers includes shopping malls, executive recruiting firms, bookstores, retail music stores and the newspaper industry.

Ultimately, it is the users of the Internet that seem to derive the most benefit—whether they be consumers who purchase goods online or businesses that use the Internet to overhaul and streamline their back office operations.

Technology’s Influence on Industry and Society

Renowned French historian Fernand Braudel wrote about the history and economy of the Mediterranean region.  He explored how technological innovations such as pottery and seafaring affected society and shaped people’s lives.  According to Braudel, “The history of technology is that of human history in all its diversity.  That is why specialist historians of technology hardly ever manage to grasp it entirely in their hands.”  Indeed, technological change affects far more than just economic growth and business productivity.  Innovations can create entirely new markets or redefine existing ones.  But innovation also drives social change—the Internet has spread American culture and values such as democracy, capitalism and political liberalism throughout the world.  Some observers fear that the Internet will weaken social connections, resulting in rootlessness and inequality.  While it has empowered individuals, the Net has also weakened the role of centralized powers such as governments.  Technological innovation can produce unexpected social consequences and, as most technologies are morally neutral, the end results can be for good or ill.


Technology is meant to serve us—to provide a benefit or solve a problem.  Unfortunately, when new technologies are unleashed on society, they may carry unintended consequences that are not entirely positive.  Technological innovation can proceed at one rate while the human ability to absorb the change and modify behavior proceeds at a far slower one.  Personal computers have been around since the 1970s, but it was not until the late 1990s that they became a fundamental part of many people’s lives.  Sometimes technology fails to take root because of human factors; other times the technology stays and humans must cope with it.

Despite the fact that we live in an era of global communication, technology can often have an isolating effect.  The evolution of radio technology illustrates the point.  Early radios were bulky appliances filled with vacuum tubes that served as the centerpiece of family entertainment.  With the advent of the transistor, radios became miniaturized and could be listened to in private.  Further advances led to the portable Sony Walkman that could be stowed in a large pocket.  Relevant examples abound:  printing technology created the portable book that allowed people to read in private; Automated Teller Machines (ATMs) reduced the need to visit bank branches.  More sophisticated technologies such as elaborate video games can draw people into alternate realities—make-believe worlds that alienate game players from genuine social contact.

In many respects, information technology perpetuates technological isolation.  Even when connected to the Internet, the personal computer is still designed for use by a single person.  Internet shopping reduces the need to venture out into the world and interact with people.  Information technology allows the workplace to invade the home without offering the warm interaction of coworkers.  Writing in Shift magazine, Richard Longley put it succinctly: “Transistors allow the human tribe to fragment into constellations of electronically connected solitudes.”

In work environments, technology can result in a loss of independence and control for employees.  When managers use information technology to constantly record, measure and monitor employee performance and behavior, the potential benefits come at the cost of worker autonomy.  In addition, while innovations can transform the workplace and redefine job roles, they can also diminish the human element of work when increasingly sophisticated technologies substitute for human knowledge and expertise.  Employees may feel like extensions of machines instead of individuals with unique abilities.  In such scenarios, technology produces low job satisfaction and a profound loss of meaning and identity for workers.

Technology increases the pace of everyday life.  Personal computers, the Internet, fax machines, e-mail and cellular phones have all accelerated the pace of business while increasing work complexity.  And when people have difficulty coping with the pace of change or the level of complexity, they tend to experience a state of anxiety.  As technology comes to dominate our work and home lives, the increased isolation, loss of control and stress can contribute to serious psychological problems.  Negative consequences include insomnia, apathy, depression, nervousness, exhaustion, social anxiety, short tempers, impatience, increasing rudeness, escapism through drugs and adversarial conduct.  Technology has also demonstrated a tendency to polarize human behavior: increased aggression on the one hand and increased shyness and withdrawal on the other.


Although predicting how technological innovations will change society is a difficult challenge, an investor should at least be aware of the potential for secondary social and psychological effects.  These influences may be less obvious than direct economic effects, but they are important nonetheless because, in the long run, social trends often have profound impacts on business and industry.  For example, the negative psychological effects caused by technology affect the health care industry.  The pharmaceutical industry has been at the forefront of the battle against clinical depression and anxiety, and society’s increasing use of technology may play a part in determining demand for such drugs.  In 1988, Eli Lilly and Company introduced a new class of antidepressants with the U.S. launch of the drug Prozac.  The drug became a lucrative blockbuster for Eli Lilly, generating sales of $28 billion by 1999.

On a more encouraging note, people have found effective ways to cope with technology in their lives.  In white-collar, clerical and administrative settings, labor requires more brainpower and less physical exertion.  Unlike farmers or factory workers in the industrial era who just wanted to lie down at the end of a grueling workday, people in technology-intensive jobs need a physical outlet.  Sedentary workers in the information age seek to reduce stress through physical activity and participate in leisure activities where they can use their hands and bodies to counter-balance the mental energy exerted at work.  As a result, the growth of computers in the workplace has been matched by growth in the health and fitness industry and in activities such as jogging and weight-lifting.

Another secular trend to benefit from technology has been the growth of the do-it-yourself home renovation market.  Beyond the physical exertion and the pride of building a new addition onto a home, there is a tangible, concrete benefit to being able to see, touch, feel, smell and even live in a home renovation project.  Working with software and computers requires a great deal of abstract thought, and this can leave users with a sense of remoteness.  Consequently, it is easy to understand the appeal of a home renovation project that engages the senses and leaves a lasting impression.  For investors, growth in the home renovation market fueled Home Depot—one of the greatest stocks of the 1980s and 1990s.  Founded in 1978, Home Depot went public in 1981.  Between its initial public offering price in 1981 and its price in February 2000, Home Depot’s stock increased 169,900 percent, a multiple of 1,700 times the IPO price.
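The relationship between a percentage gain and a price multiple is easy to misread, so the Home Depot figure is worth checking: a 169,900 percent increase means the final price is 1 + 1,699 = 1,700 times the starting price.  A quick sanity check (a minimal sketch; the function name is mine, not from any source):

```python
def gain_to_multiple(percent_gain):
    """Convert a percentage price increase into a price multiple.

    A gain of 100 percent doubles the price (multiple of 2);
    a gain of 169,900 percent yields a multiple of 1,700.
    """
    return 1 + percent_gain / 100

print(gain_to_multiple(169_900))  # prints 1700.0
print(gain_to_multiple(100))      # prints 2.0
```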

In the automobile industry, internal combustion engine technology led to the mass adoption of automobile travel.  And—just as computer technology affected the health and fitness and home renovation markets—the internal combustion engine produced far-reaching economic and social consequences.  Combined with the economic boom following World War II, the automobile spurred a home-building boom and tremendous development of suburban areas in the United States.  Those who profited included the construction industry, mortgage lenders, farmers who owned real estate on city outskirts, and shrewd investors—the Vatican’s money manager, Bernardino Nogara, purchased huge parcels of land in American suburbia.

Television technology has demonstrated direct industry effects as well as more subtle social effects that influence the business world on an entirely different level.  Television transformed the movie industry by creating new demand for filmed entertainment and giving film libraries a key distribution outlet.  The studios eventually grew into today’s giant media conglomerates such as AOL Time Warner, Disney and News Corp.  For example, back in 1953, Columbia Pictures had a market capitalization of just $9 million.  In 1989, Coca-Cola sold Columbia Pictures to Sony for $3.4 billion.

Yet television’s effects did not end with the Hollywood studios.  By creating large nationwide viewing audiences, television fostered a uniformity of popular taste from which consumer product companies were able to benefit.  Firms such as Procter & Gamble and Anheuser-Busch used television technology to create and reach a mass market through advertising.  Furthermore, satellite technology and the Internet have enabled global communication—a global village, as the communications theorist Marshall McLuhan called it.  We live in a world of instantaneous distribution of information and culture: people throughout the world can watch the exact same television programming or access the same Web sites.  This technology homogenizes popular tastes and social trends throughout the planet, leading to similar demand patterns and fashion trends.  For business, global convergence creates a worldwide market for standardized consumer products, allowing firms to benefit from economies of scale as fixed costs are spread over a larger base.

The social effects of technological change may be less direct and less obvious than economic effects but they are important nonetheless because, in the long run, social trends often have profound impacts on business and industry.


Multibillionaire investor and philanthropist George Soros was born in Hungary of Jewish parents.  Soros was 14 years old when the Nazis invaded Budapest in 1944.  His family narrowly escaped the Holocaust by moving frequently and assuming false Christian identities.  After the war, Soros suffered through further totalitarianism when the Communists rolled into Hungary.  But in 1947, at the age of 17, Soros was able to escape from Communist Hungary to the West.  Arriving in Britain, the teen-aged Soros had no money, no contacts and no friends.  For Soros, these early personal experiences developed a worldview born of adversity and harsh reality.  The events of Soros’s youth stripped away his illusions about political dogmas and idealized human institutions.  In recent years, Soros has spoken about philosophical issues and, in particular, the importance of recognizing one’s fallibility.  Writing in the Atlantic Monthly in 1997, Soros pronounced:  “I believe in the open society because it allows us to develop our potential better than a social system that claims to be in possession of ultimate truth. Accepting the unattainable character of truth offers a better prospect for freedom and prosperity than denying it.”

As we have seen in examples from Behavioral Finance and business history, humans and human organizations are fallible.  The theme of fallibility applies particularly well to technology investing—stories of disruptive technologies and corporate decline should serve to strip away the illusions of idealistic technology investors.

There is, in fact, a theoretical basis for the special difficulties that investors face.  In the academic world, investing is a component of finance, and finance is a sub-discipline of economics.  Unlike hard sciences such as physics and chemistry, economics is considered a soft science, much like psychology and sociology.  The Greek mathematician Pythagoras developed his Pythagorean Theorem some 2,500 years ago, yet it still holds true to this day.  By contrast, economic theories come and go without the verifiable progress that the hard sciences have shown.  Natural phenomena demonstrate a degree of certainty and predictability that social phenomena lack.  The stock market is a human institution where the scientific method does not apply and universal laws are difficult to establish.  For instance, behavioral finance has undermined the theories of rational expectations and efficient markets.  Moreover, in economics, there is a disconnect between market participants’ perception of reality and the underlying facts.  Investors make decisions based, in part, on these misperceptions, and the buying and selling stemming from those flawed decisions then influences the market itself.  It is therefore impossible to discern ultimate truths in a situation where our own misperceptions influence the events.

The shrewdest investors have a perspective on investing that is congruent with reality.  These top investors have an ability to think rationally and objectively while minimizing misperceptions.  But equally important to successful investing are the traits of humility and pragmatism.  Before investors can reach their potential they must first recognize their limitations.  They must recognize the full complexity of reality.  And they must recognize when their understanding is imperfect and when rationality cannot explain uncertainty.  But once investors acknowledge their fallibility and accept it as an article of faith, they can begin to get a genuine grasp of what is possible.  As George Soros contends, “a belief in our own fallibility can take us much farther.”

While no investor can foresee all eventualities, they are far harder to predict in the dynamic area of technology investing.  A proper investment valuation requires discounting future cash flows to the present.  However, it is impossible to accurately forecast the long-term performance of most technology firms—the future is too unpredictable.  When a business is exposed to innovative technology, it must contend with many changing variables beyond the firm’s ability to control.  Even the best firms are vulnerable to disruptive technologies—they can plan countermeasures, but they cannot stop the train once it has left the station.
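Discounting cash flows to the present is mechanically simple; the difficulty lies entirely in forecasting the inputs.  A minimal sketch of the calculation (the five-year $100 cash-flow stream and the 10 percent discount rate are hypothetical illustrations, not drawn from any real firm):

```python
def present_value(cash_flows, rate):
    """Discount a series of future annual cash flows back to today.

    cash_flows[0] is the cash flow arriving one year from now,
    cash_flows[1] two years from now, and so on.
    """
    return sum(cf / (1 + rate) ** (t + 1)
               for t, cf in enumerate(cash_flows))

# Hypothetical firm: $100 per year for five years, discounted at 10%.
pv = present_value([100, 100, 100, 100, 100], 0.10)
print(round(pv, 2))  # prints 379.08
```

The point of the sketch is that small changes in the forecast cash flows or the discount rate move the valuation substantially, which is precisely why technology firms with unpredictable futures resist this kind of analysis.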

But investors can choose to minimize the unpredictable and disruptive effects of technology by avoiding high-technology firms.  They can focus on firms and industries that are easy to understand—firms that have strong competitive positions in stable industry environments with high entry barriers.
At the same time, investors can look for companies that are consumers of technology—many leading firms have generated tremendous shareholder wealth by, in part, using technology for competitive advantage.  Therefore, investors should consider a firm’s ability to exploit technology when evaluating investment prospects.

When investors do decide to purchase shares in technology firms, they should pick their spots and wait for those rare opportunities when the uncertain rewards are worth the risk.  To make the grade, technology firms should demonstrate the requisite complementary skills, such as marketing and manufacturing prowess.  Investors should be aware of the benefits of key success factors such as market leadership, large scale and control of industry standards.  As well, investors should seek out firms benefiting from strong network effects and high switching costs.

Technology firms have an alluring quality—there is an element of excitement and challenge in the uncertain realm of technology investing.  And technological progress depends on investors who are willing to take risks and invest in technologies with uncertain futures.  However, in general, I would recommend leaving those uncertainties for other investors who are willing to absorb the risk.

It is a rare individual who can demonstrate absolute intellectual honesty, subduing his or her ego and tendency toward wishful thinking.  But the potential payoffs to be reaped from this form of self-discipline far outweigh the sacrifices.

Any person considering an investment should seek independent advice from an investment advisor on the suitability of the particular investment.  Blog postings are for educational purposes only and do not constitute a recommendation for the purchase or sale of any security. The information is not intended to provide investment or financial advice to any individual or organization and should not be relied upon for that purpose.