AEIdeas

The public policy blog of the American Enterprise Institute


Discussion: (28 comments)

  1. “The downside here is that a slowdown in price declines has been accompanied by a slowdown in tech investment…”

    Two problems I see here…

    The cost of chip fabrication has gone up…

    The price of technology has gone down…

    Still, there are folks who don’t think the tech gravy train is really slowing down that badly…

    CE Industry Revenues to Reach Record-High $209 Billion in 2013, According to CEA

    Arlington, VA – 01/08/2013 – Revenues for the consumer electronics (CE) industry are projected to grow nearly three percent, reaching a new record-high of $209.6 billion, according to the semi-annual industry forecast released today by the Consumer Electronics Association (CEA)®. The forecast also shows 2012 industry revenues reached $204 billion, up five percent from the previous year. CEA President and CEO Gary Shapiro announced the forecast in his opening remarks today at the 2013 International CES®, the world’s largest annual innovation event. (there’s more)

  2. Ken Royall

    It’s a pause driven by a slow economy. It isn’t the cause of the slow economy. Look around: do companies have the “ultimate” systems in place that deliver the maximum amount of productivity possible? Not even close.

    If we ever get the economy back on track, investment in tech will increase, driving further innovation.

  3. Michael Macedonia

    It also depends on what you measure and what you call IT. The PC business is stagnant. But the IT industry is moving to cloud and mobile (tablet, smartphone) and embedded computing (e.g. http://seekingalpha.com/article/1104561-intel-busting-the-mobile-margin-myth), enabling major gains in productivity. Drivers don’t get lost and now take optimal routes to save fuel. Small manufacturers use cloud simulation services to make safer, more effective, cheaper products. Computing is now part of everything and the enabler of innovation.

  4. David P. Schwarz, OD

    It could be that the market has been distorted by gov’t fiat. Electronic Health Records were mandated by the Bush Administration. The mandate is now being realized due to the timetable in the mandate. Without this mandated switch to EHR, I would not have increased my computer, LAN, and software investment by 300 to 350%. Then there are the many IT man-hours I would not have chosen to pay for to integrate my existing equipment into my new EHR system. Mandated PC/LAN/software purchases by healthcare providers, hospitals, and clinics across the country are causing an artificial demand spike leading to sectoral inflation. This effect will likely dissipate in 2 years or so as the mandate is fulfilled.

    1. “Electronic Health Records were mandated by the Bush Administration…”

      Could you just drop your employee health benefits and not have to deal with all that mandated nonsense?

      1. The OD in his name would indicate he is an optometrist. Thus, he can only escape the mandate by dropping his customers.

        1. “The OD in his name would indicate he is an optometrist…”

          I understood that part, jethro, hence the reason I used the term ‘employee’…

          1. In order to stay in business you have to be part of an Accountable Care Organization, and in order to do that, you have to have an EHR that connects you to the rest of your organization.

            Also, you wouldn’t believe how much money is wasted because records are kept in files in some office somewhere. Imagine paying for 7 different MRIs because the right hand doesn’t know what the left hand is doing, even within the same hospital.

          2. “In order to stay in business you have to be part of an Accountable Care Organization, and in order to do that, you have to have an EHR that connects you to the rest of your organization…”

            OMG! Here I thought having to file quarterly tax returns was over the top!!

            Thanks for that info, chris

  5. GogogoStopSTOP

    Personally, I thought I’d be getting to work in my flying car, as GM predicted we’d be doing by the 1990s. But now I’d be satisfied if my car would drive me to the ’bot serving Big Macs at the drive-thru. Who’d a thunk it?

    Oh, and I’m still waiting for the Japanese to take over the computer industry. And, lastly, electronic ‘computers’ have been with us for 70+ years now… Jeez, that’s a h_ll of a long revolution, huh?

  6. Gilgamesh

    IT will grow in direct proportion with the service industry, which will eventually surpass manufacturing as America’s prime employer and producer of US GDP.

  7. I’ve owned a personal computer since 1981. I have an M.S. in Computer Science and work in this field. There is absolutely no way the computer revolution is over. We are only at the beginning. The room for growth is incredible. Most manufacturing jobs will be taken over by robots within 20 years. Almost all cars will be self-driving shortly thereafter. Computers will be attached to our bodies and integrated into our brains. We are only scratching the surface. The future is both fascinating and scary.

  8. Enabling technologies that spawn entire industries are by nature unpredictable.
    Never underestimate the catalytic reaction between a young, prepared mind, persistence, and luck.
    And, by all means, avoid extrapolating existing trends into the indefinite future.
    Technology does not evolve linearly. It evolves in unanticipated jumps.

    1. GogogoStopSTOP

      Well said. Technology is sort of quantum in nature. There are thousands of ideas, thousands of patents, but it’s what a larger collective population does when clever new users say, “Wow, look what I can do with this!” That’s when there’s a quantum change & the world changes. Revolutions in technology are not predictable, but you know it when there’s exponential adoption & growth.

  9. I own a computer repair and network service company. I can tell you, firsthand, that the price of technology is as low as it can get, which is why you do not see it falling further.

    At a certain price point, it is simply not worth building. I turn people away when they ask us to build custom computers or servers; competing with off-the-shelf stuff is totally unprofitable.

    The downside to this is that pretty much everything you pick up off the shelf at Best Buy is a piece of junk that will not last 2 years without some kind of malfunction. The race to the bottom has hit bottom. There is nowhere left to go.

    1. Akatsukami

      “The downside to this is that pretty much everything you pick up off the shelf at Best Buy is a piece of junk that will not last 2 years without some kind of malfunction.”

      Which is the corollary to Moore’s Law: if it’s going to be obsolete in two years, why build it to last any longer?

  10. Which makes me wonder, is the Industrial Revolution already over? Will things go back to how they were in the 1700s?

    I.e., it’s an absurdist crap of a question.

  11. “take whatever policy steps we can to hit the gas pedal.”

    that statement makes my hair stand up in fear … government policies have caused the problems and certainly won’t solve the current ones …

  12. It’s easy to imagine devices that will cause another explosion in horsepower requirements – hologram displays (and pin-point audio to go with them), air sensors that monitor you and your family’s health profiles, tax-buddy systems that keep refining your daily transactions to leverage the multi-gazillion-page tax code to your advantage… I think the constant and increasing load of the welfare state, and statism in general, repels investment.

  13. Those charts suggest that the diminishing percentage of IT price declines tracks rather well to both Dubya’s weak dollar policy and the easy money regime under Ben Bernanke. The other thing it suggests is that we’ve already extracted the easy value out of IT outsourcing to India and electronics manufacturing to China; future value will need to be realized through more difficult and innovative methods.

  14. I design chips using those bleeding edge technologies, so I can offer some guidance.

    Yes, the revolution is slowing. It will soon break. We’re hitting physical limits.

    1) For a 16nm FinFET, you’re talking about a transistor roughly 160 atoms wide and about 15 atoms thick. That dimension used to scale by about 70% per generation in the old version of Moore’s Law. Even today we’re hitting a fair number of quantum mechanical effects, and by the time we’re at 5nm in a few years, devices will be ruled by quantum mechanical effects and, let’s just say, computations will get interesting there. At some point “1” won’t mean “1”.

    2) Moore’s Law talks about density. It doesn’t talk about the speed of the devices. That’s been flatlining since about 65nm. At that point the device scaling rules completely broke and we’ve been making tradeoffs. Basically, we’ve been holding the transistor threshold voltage constant while we scale the devices. That’s because if we don’t do that, the chips will “leak” too much charge, and that wasted power will easily match or exceed the useful switching energy by the time we hit 28nm. But that means that device speeds have to go down. FinFETs help to some extent, but not enough to keep frequency scaling like it used to. (A back-of-the-envelope sketch of this tradeoff appears after this comment.)

    So, the short version is: yes, chips are slowing down. There are fundamental, physical effects coming into play and essentially the easy gains of the last five decades are over. And unless there’s a radical (and I mean really radical) innovation, your grandkids won’t remember the days of exponential growth in compute power.

    But just because chips are slowing down and costs are exploding doesn’t mean that the computer revolution is over. I’d argue the opposite. The hardware guys have been blowing ahead so fast that the software guys haven’t been able to keep up.

    What’s happening now is that compute power is so cheap that the software guys have just begun to see what they can do with the power out there. We’ve just started to automate things, sensors are just starting to be really deployed, and networking is really just starting. Remember, nearly all the gains in your car’s fuel efficiency since the 70s have been due to electronics, and that hasn’t even begun to slow yet.

    The chip revolution will stall in maybe a decade or two, but the software/computer revolution has much, much longer than that to run.
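
    A back-of-the-envelope sketch of the tradeoff described in points 1) and 2) above; the 0.7 scaling factor, the dynamic-power formula, and the subthreshold-leakage relation are standard textbook approximations, not figures taken from the comment:

    F_n = (0.7)^n \, F_0, \qquad \text{area per transistor} \propto (0.7)^{2n} \approx (0.5)^n

    P_{\text{dyn}} = \alpha \, C \, V_{dd}^{2} \, f, \qquad I_{\text{leak}} \propto e^{-V_{th}/(n \, kT/q)}, \qquad kT/q \approx 26\ \text{mV at } 300\ \text{K}

    Halving the transistor area each generation is the density half of Moore’s Law; the exponential dependence of leakage on V_{th} is why designers hold the threshold voltage roughly constant, which in turn caps how fast the devices can switch within a fixed power budget.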

    1. GogogoStopSTOP

      Hey Nerdbert, I really enjoyed your post. Your view sounded learned.

      I remember single-transistor “integrated” circuits, I recall the T2L-versus-current-switch argument, & I remember the bipolar-versus-FET transition where density started to trump speed. When the density of ‘slow’ on-chip FETs clearly outpaced circuitry that required too many off-chip transitions… Moore’s Law BECAME valid… & the race was on!

      We in the hardware mainframe business always chortled at the software engineers, ahem… programmers, for always being late!!! ROTFLMAO! Personally, I agree with you; however, software has a few years to make the High Life continue for another generation!

    2. To Nerdbert – your 1/13/13 post on AEIdeas is “spot on” as to the near future in IT and computing. I tried to print out your thoughts to share with my partner in our start-up, as well as some other folks, but the AEIdeas site wouldn’t cooperate and allow me to do so. Can you email me a copy of your comments directly, so I can share them with the above? My address is: [email protected] Your comments made more sense to me than anyone else’s on the site. Thanks in advance if you’re able to do this. Kent Pearce

  15. The increasing numbers and types of software patents are also stifling the industry. Unlike manufacturing and physical engineering, there are only so many variations on how to make a computer perform tasks, and if you keep patenting those limited variations, you soon get into the problem of no longer being able to innovate, and you start running into what we currently face: increasing “patent wars.”

    Mathematical formulas were once unpatentable, and software from the beginning was treated like mathematical equations (which, in terms of Boolean algebra, it still should be), which prevented software from being patented. Companies in the 80s didn’t like the limited and insufficient protection that copyright afforded, and so they pushed for patenting software. I remember when AT&T patented the “exclusive OR (XOR)” method of covering and exposing windows on a screen; this stifled the ability of the open-source “X Window” package to implement window exposing and hiding, which all windowing systems use.

    Software patents are far more insidious than manufacturing patents. Since the product of software development is intangible and “virtual”, and because it really is a way of thinking, any time a developer works out a process for manipulating certain bits to achieve a certain different set of bits, the developer is forced to constantly perform a patent search to see if anyone else has thought of this bit-processing method.

    There are many patentable ways of making iron, but there is only one way of performing an XOR operation on bits, and I would like to see reform of the patent process to either eliminate software patents altogether, or at least limit them to one or two years at most.
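
    For readers unfamiliar with the XOR technique mentioned above: it is used for cursors and rubber-band outlines because applying the same mask twice restores the original pixels, so nothing has to be saved and redrawn. A minimal sketch in Python, with made-up pixel values (an illustration of the bit operation only, not code from X Window or from any patent):

      # XOR drawing: applying the same mask twice restores the original pixel,
      # so an overlay can be drawn and then erased without saving the background.
      background = [0b10110010, 0b01101100, 0b11110000]  # original screen pixels (made up)
      mask = 0b11111111                                   # overlay pattern

      drawn = [p ^ mask for p in background]   # first XOR: overlay appears (pixels inverted)
      erased = [p ^ mask for p in drawn]       # second XOR: original pixels restored
      assert erased == background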

  16. Carl Pham

    Yep, it’s over. In fact, the acceleration phase has been over for about 20 years now, and we’ve just been coasting. It was obvious in the 90s, when the stuff you could buy for your desktop wasn’t seriously different from what was being used in supercomputing centers.

    I guess everyone thought that was because there was some newly zippy conduit from the leading research edge to the commodity user. Nonsense. It’s because the leading research edge in computing hardware crapped out in the 80s. And for that matter, so did system-level software and the major computing paradigms. Since then, no one has invented anything nearly as remarkable as what was invented in the 70s and 80s. It’s all just been refining, polishing, reducing the size, adding bells and whistles, filling in the obvious extensions. You can coast quite a long time on that — but not forever.

    This is just the way technology goes. There was an absolute explosion of automobile tech from the 1890s through maybe the 1950s, but since then, it’s just been refining and polishing. Same with airplanes from the 1900s through the 1970s. You pick all the low-hanging fruit and then… things get very tough, and advances slow to a one-per-generation crawl.

    The only people truly shocked are going to be those who naively made a simple extrapolation of computing tech in the 90s and 00s and predicted that by the 2020s and 2030s the Singularity will have arrived and we’ll all be living in virtual reality, uploading our consciousnesses, talking to our phones in natural language, et cetera. Nope. This is the same mistake made by people who made a straight-line extrapolation of aerospace progress in the 1940s-1960s and predicted colonies on Mars by 2015.

  17. No one else seems to have mentioned it, but I’d also be curious to see the charts with Apple’s insane margins removed from the data. I’m sure the slow economy accounts for reduced R&D expenditures, but aside from the sharp uptick in prices following the collapse of the dot-com bubble there only seems to be about a 3% shift, which captures Apple’s growth to 10 or (claimed) 20% market share along with their outsized margins. To the extent that matters, the market isn’t failing but simply incorporating people’s willingness to pay more for improved user-friendliness, product design, and social status, as opposed to the raw computing power being measured here.

  18. I can’t edit this into my last comment for some reason (poor comment software), but, yes, the ability of Apple to thus far monetize its improvements in user-friendliness and design is in great part attributable to the current rent-seeker-captured regime of IP law.

  19. williamwarbler

    The IT computer revolution is most definitely not over. It is going in a different direction.
    The ‘Commercial Computing Revolution’ era, for me, started in 1959 when I worked on the first commercial computer designed and built in England. Time has ended this era. Commercial computing now has well-known practices. When enterprises used this technology, they had to make transformative organizational changes to reduce labour costs and so maximize their benefits to stay competitive. Economists created ways to identify and measure these changes.
    We are now in what I call the ‘Societal/Mobility Revolution’ era. It will continue to create thousands of new IT jobs, but I think economists and others will need to create new ways to measure its impact, since the gains will not necessarily be in our enterprises.
    The coincidental convergence of several different areas of technology has produced this change in direction for the IT revolution and its subsequent impacts.

Comments are closed.
