Computers Reduce Efficiency: Case Studies of the Solow Paradox (2023)

https://scottlocklin.wordpress.com/2023/11/21/computers-reduce-efficiency-case-studies-of-the-solow-paradox/

I’ve harped on this in my sidewinder and slide rule blergs, as well as older ones: very often, using a computard is a recipe for failure, and old fashioned techniques are more efficient and useful. The sidewinder guys at China Lake actually called this out in their accounts of the place going to seed: computers and carpets ruined the place’s productivity and most of all innovation. I’ve mentioned my own anecdotes of CAD failures, and the general failings of computers in a design cycle. This was an early realization of mine, even before the internet existed as a distraction. In my first serious programming projects I had a lot of good experiences literally writing out the Fortran on a giant brown paper trashbag cut up and stapled into a scroll-like object, compared to people who would laboriously use Watcom or Emacs or whatever people were using in those days as an IDE, looking through the toilet paper tube of 640×480 90s-era monitors. I attributed this to simply being able to look at the whole thing at a glance, but for all I know, writing it with a pencil and engaging my hand’s proprioceptors, or not looking at a hypnotically flickering screen, were the magic ingredients. I’m beginning to think the latter things are more important than people realize.

The RAND Corporation did a study of the failings of CAD in the design of British nuclear submarines. Before computard/CAD tools, people would use the old timey techniques of drafting in 2-d on a piece of paper, and building scale models to see how pieces fit together. Literally laboriously using weird T-square tools and a pencil and building plaster models was faster than using advanced computard technology. Again, this is something I’ve actually experienced in building experimental physics gizmos. You can spend months on a design in Solidworks and make something impossible to fabricate which doesn’t line up with the rest of the system: I’ve seen it happen. Dude with a tape measure can fix it in a few hours if it’s a one-off; somehow these problems don’t come up with models and slide-rule and paper design. This was admitted in Parliament in their investigations of the cost overruns on the Astute class submarines. It boggles my mind that people still don’t realize this is a real problem. We get mindless repetitions that “software is eating everything” like some kind of mantra despite evidence to the contrary. Instead of studying the problem, it’s simply dismissed. Nobody trains in non-CAD drafting any more, so we can’t exactly go back to that.

Now the RAND study did sort of elide over the core problem by stating that American experts (who had been using CAD and had run into many of these problems before) at the Electric Boat company helped unfuck the British program. They did not ask the basic question of whether or not CAD was mostly harmful; it may be that its use reduces productivity overall and people might be better off only using it strategically. We’ll never actually know, because unless the Russians are still using old timey drafting methods, we don’t have a comparison class which isn’t time-censored (the Chinese would never think of this: using paper would be seen as losing face).

Another study is one on CAD design back in 1989. The author uses the example of printed circuit design, something that has long since been given over to CAD. Back in those days a lot of the designs had to be refactored by hand. He also notes the danger that future generations of designers might have atrophied skills which won’t enable them to do this. He notes that CAD didn’t eliminate the job of the draftsman or increase his output; the draftsman just does it on a computer now.

For another example, Richard H. Franke studied an early adopter of now widespread computer technologies: the financial services industry. This is wholly remarkable because if any field would show an increase in productivity due to computer adoption it would be financial services, but he pretty definitively proved that, up to 1987 anyway, the productivity of financial services went down due to the introduction of computers. Not by a little bit either: by a lot:

[Figure: Franke’s plot of financial services productivity. Note the traditional non-computard plot he made: probably got published 2 years earlier because of this.]

Note that in the same paper he found the introduction of CAM in manufacturing was also associated with a similar productivity decline. You can sort of imagine why: computer equipment was expensive and people had to learn how to use it. But there are probably larger scale effects. I have small machine tools in my house; none of them are CAM tools. If I want a part, in most cases it’s dirt simple to get out the calipers, visualize it in my mind and make the damn thing. At most I need to fool around with a ruler and a piece of graph paper. I can’t make everything this way, and there are a number of doodads I’d have a hard time with, where a $50,000 CAM mill that fills my entire machine shop would be able to do it. The CAM thing would do the corner cases, but I’d spend months learning how to use the thing, spend tons of loot keeping it running (it’s much more complicated and prone to failure), and I’d spend all my time ministering to this monstrosity, learning to use whatever CAD tools are out there, and forgetting how to make precise cuts on my manual mill and lathe. The same story was probably true in FinServ. Their routine tasks were made more complicated by ritual obeisances to the computer gods.

Somewhat to my surprise there are enough examples of this that economists have actually come up with a name for it. It’s called the Solow Paradox. Robert Solow is a 99-year-old MIT emeritus professor of economics who quipped in 1987 that “You can see the computer age everywhere but in the productivity statistics.” I loathe economists as a pack of witch doctors with linear regression models, but the effect is large enough even they noticed. Everyone was relieved when GUIs and LANs came out in the mid-90s, and these technologies did seem to be associated with an increase in productivity in some sectors of the economy. This measurable increase basically stopped when people started wiring their computers up to the internet. It’s not like MS Word does anything different now that it didn’t do in 1995. It just requires more resources to run.

For brick-and-mortar retail one can understand how productivity increased in the 90s. It’s a hell of a lot easier using bar codes and a database back end to manage your inventory than whatever ad hoc filing cabinet systems people were using before. With inventory control you can optimize your supply chain and get further efficiencies. Buying it all from China also helped the firms involved in doing this (it didn’t do any good for the country’s manufacturing capabilities of course, but that’s out of scope for economists). This process was happening in the 80s, but computers were still running things like DOS with VAX and AS/400 backends, all of which required the ministrations of a large caste of IT professionals. Hooking everything up to a LAN with GUI front ends helped lower the IT head count, so the IT guys could go off and invent new businesses involving wasting time on the internet. Later you got some productivity growth from selling stuff online instead of in shops (which are a large cost center). BTW this is my interpretation of why the Solow paradox came back in the 00s: it’s the most obvious interpretation.

[Figure: recent BLS productivity growth. You can see here the “paradox” that computers aren’t helpful is back.]

[Figure: earlier productivity growth figures, not shown on the current year BLS website.]

Just to tangentially remind people, this is the promise of computing and wistful fantasies like “AI”: you want to increase the productivity of a worker to the point where you can make do with fewer workers, outputting the same amount of product for cheaper. If you have a technology that doubles a worker’s productivity, but requires another worker to minister to the technology, you haven’t increased your company’s productivity: you’ve decreased it, because you have the same output per worker and an incurred cost for the technology. If you have a technology which marginally increases a worker’s productivity but still requires another worker to minister to the technology, you have made productivity significantly lower. It is entirely possible that you might lower a worker’s productivity with a technology, as we saw with the British attempt to cut submarine design costs using CAD instead of T-squares, pieces of graph paper and styrofoam models.
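As a toy sketch of that arithmetic (my own made-up numbers, not figures from Franke or RAND): productivity here just means output per head, counting everyone on the payroll, including the people who minister to the technology. Doubling each producer’s output while hiring a tender per producer leaves output per head exactly where it started, minus the cost of the gear; a marginal boost with the same tending staff is a disaster.

# Toy arithmetic only -- illustrative numbers, not from any study cited above.
def output_per_head(producers, output_each, tenders=0):
    """Output per worker when `tenders` extra staff minister to the technology."""
    return (producers * output_each) / (producers + tenders)

baseline = output_per_head(producers=10, output_each=1.0)               # 1.00
# Tech doubles each producer's output but needs one tender per producer:
doubled  = output_per_head(producers=10, output_each=2.0, tenders=10)   # 1.00, plus the tech's cost
# Tech gives a marginal 10% boost but still needs the same tending staff:
marginal = output_per_head(producers=10, output_each=1.1, tenders=10)   # 0.55 -- much worse

print(baseline, doubled, marginal)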

The economists, naturally, are lowering their own productivity by arguing about this; some of them claim it’s not real and that the productivity increases show up at some nebulous later date in an unspecified way that apparently can’t be measured. Some of them simply flip over the tables and insist the computers are good and we should find a new way to measure productivity that involves fucking around on a computer. This despite the abundant evidence that productivity is slowing down or declining even with our pervasive computard technology. They fiddle around with linear regression models of varying degrees of sophistication. They argue on the internet. What they don’t do is look for situations where the data show differences, in an attempt to understand the problem well enough to provide guidance or solutions. This is a microcosm of everything else: rather than solving a problem, they’re looking busy by furiously typing on their computards. Economists in the pre-computard era were more than capable of this sort of thing: Burton Klein made a good stab at looking at productivity improvements using pencil and graph paper.

One of the things that introducing a new technology does do: it redistributes resources and what people do on a daily basis. I don’t know if Avon Ladies are still a thing, but now there are Instagram whores shilling things. Database vendors and DBAs get money instead of filing cabinet manufacturers and filing clerks. Instead of filling out paper forms, people fill out computer forms. For another example, computard made decimalization a thing, and opened up market making to people all over the world, instead of a couple of knuckle dragging former football players in Chicongo and NYC who went to the same couple of high schools. Any individual who buys stonks now (and a lot more people do) will get a better price. Lots more guys like me get paid. Still, total productivity has gone down. Is it better to pay a couple of dumb incumbents more money, or more Ph.D. types less money and spend the difference on computers and stratum 0 NTP servers instead of coke and hookers?

Robotic automation may remove jobs from blue-collar workers and assign more jobs to white-collar workers and the pyramid-scheme institutions which certify white-collar workers. It would be hilarious if we automated all the manufacturing jobs with robotics and it lowered productivity: that actually seems to be the trend. The long term trend of this is that lower-IQ people have nothing remunerative to do, and higher-IQ people in these jobs don’t reproduce, because they educated themselves out of their fertility windows. That’s another issue nobody in economics wants to think about, but an Amish farmer would probably notice.

Mind you, robotics is something I think is worth investing R&D dollars in. All these “AI” goons fooling around with LLMs or larpy autonomous vehicle nonsense should be working on workaday stuff like depth estimation, affordance discovery and scene understanding, or other open problems in robotics. It’s the mindless application of current year information technologies in areas they are not suited for, or not helpful in at all, that I find disagreeable. We add computers to things not because it makes them better, but as a sort of religious ritual to propitiate technological gods. The gods are not pleased with our sacrifices. We do them anyway, like the cargo cult guy with coconut earphones trying a different variety of coconut in hopes of getting a different answer.

The persistent presence of the Solow “paradox” ought to give us pause about how we develop and innovate new technologies. If I visit a company claiming to innovate things, is there a computer on everyone’s desk? Does there need to be a computer there? What are people doing at their computers? Is it mission oriented or are they just fucking around with a computer? I suspect banning computers in R&D facilities, except where absolutely necessary, would pay dividends. Banish them to special compute rooms, and limit employee time there. Someone should try it; there’s nothing to lose: all R&D is a gamble, and at least you won’t waste time fiddling with computers.
