AI Economics 101

The advent of a new age for artificial intelligence raises many broad questions for the future of our economy. But attempting to answer them head-on would be ill-advised: before diving into issues such as employment or inequality, it is necessary to understand the main concepts that economists use to frame their scientific controversies. You need to speak “economese” to assess the impact of AI, and here is a conversation guide.

In your reading, you may be surprised to find zero occurrences of the words “artificial intelligence” or related terms. The explanation is that the current AI boom is too recent to have yielded brand-new concepts or schools of thought. For now, mastering AI economics means leveraging terms derived from the economic analysis of the Industrial Revolution and of the digital age.

If you’d like to see another economic notion defined, send me an email and I’ll update this article accordingly. :-)

 

Nominal / real terms: economists draw a distinction between nominal terms and real terms. Nominal terms are counted in current prices whereas real terms are counted in constant prices. The difference is of tremendous importance because real terms “erase” the impact of inflation.

Let’s take an example: if your wage increased by 10% in 2016, was it a good thing? Well, it really depends on the parallel increase in prices. If inflation was higher than 10%, it means that your real wage decreased, thus you had a net decrease in purchasing power. If inflation was lower than 10%, it means that your real wage increased, thus you had a net increase in purchasing power.
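
To make this concrete, here is a minimal Python sketch of the computation (the growth and inflation figures are the hypothetical ones from the example above):

    # Deflate a nominal wage increase into a real one:
    # real growth = (1 + nominal growth) / (1 + inflation) - 1
    def real_growth(nominal_growth, inflation):
        return (1 + nominal_growth) / (1 + inflation) - 1

    print(real_growth(0.10, 0.05))  # inflation below 10%: real wage up (~ +4.8%)
    print(real_growth(0.10, 0.12))  # inflation above 10%: real wage down (~ -1.8%)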

The study of the evolution of indicators in real terms allows economists to assess the real impact of technological development.

 

Unemployment rate = number of unemployed people / total labor force (i.e. people either holding a job or actively looking for one). It is the most-quoted indicator of the labor market.

But it is far from being the only relevant figure to grasp the state of the labor market. There are 2 other important indicators:

  • Labor participation rate = total labor force / working-age population (i.e. generally the population aged 15 to 64, but statistics can also be compiled for other age groups).
  • Employment rate = number of people holding a job (NB: unemployed people are not included!) / working-age population

Why is this distinction critical? Because the unemployment rate and the employment rate can, for instance, both increase at the same time! This situation arises if the number of jobs grows more rapidly than the population, but less rapidly than the labor force.
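
Here is a toy numerical sketch of that situation (all the population figures are invented for illustration):

    # Hypothetical working-age population at two dates:
    # jobs grow faster than the population, but more slowly than the labor force.
    for pop, labor_force, employed in [(100, 70, 65), (102, 75, 68)]:
        unemployed = labor_force - employed
        print(f"participation={labor_force / pop:.1%}",
              f"employment={employed / pop:.1%}",
              f"unemployment={unemployed / labor_force:.1%}")

    # Output: the employment rate rises from 65.0% to 66.7%
    # while the unemployment rate also rises, from 7.1% to 9.3%.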

 

Mean / median: both mean and median are used to compute the “central” value of a dataset.

The mean (or average) of a variable is the sum of the measured values divided by the number of measures. A simple example is worth a thousand words: if 5 co-workers have salaries of $30, $35, $40, $45 and $50 respectively, the mean salary is then: (30 + 35 + 40 + 45 + 50) / 5 = $40

The median is the value that divides the population in two halves of equal size. In our simple example, there are 5 values, so the median is the third value (when measures are sorted in increasing order): that is, $40.

Note that the mean and median of a statistical population are not always equal! Let’s add a new co-worker, earning a $100 salary. The mean becomes (30 + 35 + 40 + 45 + 50 + 100) / 6 = $50, whereas the median is now $42.5 (if there is an even number of observations, take the average of the two central values to compute it).
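
The same computations in Python, using the standard library’s statistics module:

    from statistics import mean, median

    salaries = [30, 35, 40, 45, 50]
    print(mean(salaries), median(salaries))  # 40 40

    salaries.append(100)                     # add the outlier
    print(mean(salaries), median(salaries))  # 50 42.5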

We see that the median is a more robust measure because it is less impacted by outliers - but it is not always possible to calculate, since you need the full ranking of the observations rather than just their sum and count.

Measures of inequality generally use the median as their indicator of choice.

 

Quartiles / deciles / percentiles: quartiles, deciles and percentiles refer to the values of observations that separate a dataset into respectively 4, 10 and 100 parts of equal size. They are always sorted in the same way: for instance, regarding the distribution of wages in a given population, the higher the decile, the richer you are.

Similar to the median, quartiles, deciles and percentiles are used to gauge the level of inequality. For instance, to assess the aforementioned distribution of wages, a commonly calculated indicator is the interdecile ratio (D9 / D1): the wage above which you belong to the top 10%, divided by the wage below which you belong to the bottom 10%. The higher this ratio, the more unequal the distribution.
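
A minimal sketch of that computation with numpy (the wage sample is invented for illustration):

    import numpy as np

    # Hypothetical monthly wages in a small population.
    wages = np.array([1200, 1400, 1500, 1700, 2000, 2300, 2800, 3500, 4500, 9000])

    d1, d9 = np.percentile(wages, [10, 90])  # 1st and 9th deciles
    print(f"D1={d1:.0f}, D9={d9:.0f}, interdecile ratio D9/D1={d9 / d1:.2f}")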

 

Controlling for a variable: economists build models to find significant correlations (and, if possible, causal links) between variables. The dependent variable is the one you want to explain (e.g. productivity per employee in a company); the independent variables are the ones you use as potential explanations for the variations of the former (e.g. share of highly-skilled employees, investment in ICT per employee).

When you read in an economic paper “we controlled for variable X”, it simply means that the researchers included X as an independent variable in their model. Why is controlling for variables a major concern to economists? Because you want to avoid your model being “polluted” by other explanatory factors.

Let’s go back to our example: if you want to account for the impact of workforce skills and of ICT on a company’s productivity level, you’d probably want to control for the salary level. There is probably a correlation between salary and productivity, but that is not what you are trying to measure, so you must explicitly include salary in the model so as to obtain a “purer” estimate of the impact of the other variables, the ones you’re really interested in.
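
Here is a hedged sketch of the mechanics, with simulated data and ordinary least squares via numpy (all variable names and coefficients are invented for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    skills = rng.normal(size=n)  # share of highly-skilled employees
    ict = rng.normal(size=n)     # ICT investment per employee
    salary = 0.8 * skills + rng.normal(size=n)  # salary correlates with skills
    productivity = 0.5 * skills + 0.3 * ict + 0.6 * salary + rng.normal(size=n)

    def ols_coefficients(y, *regressors):
        X = np.column_stack([np.ones(n), *regressors])
        return np.linalg.lstsq(X, y, rcond=None)[0][1:]  # drop the intercept

    # Without the control, the skills coefficient is inflated (~1.0 instead of 0.5)
    # because skills also act on productivity through salary.
    print(ols_coefficients(productivity, skills, ict))
    # Controlling for salary recovers the "purer" estimates: ~0.5, 0.3, 0.6.
    print(ols_coefficients(productivity, skills, ict, salary))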

 

Productivity: as INSEE (the French national statistics bureau) explains, productivity is defined as “the ratio, in volume, between production and the resources used to obtain that production”. The 2 main factors generally considered are capital (e.g. machines) and labor (employees’ time spent working).

Labor productivity is often measured by dividing output by the number of hours worked, or by the number of employees. And because output is measured over a given period, the related capital input should be measured over the same period: that’s why capital productivity is not obtained by dividing output by the stock of capital (i.e. the value of the assets used for production), but by dividing output by the flow of what are called capital services (e.g. rather than using the value of a building as the capital input, using the equivalent rent you would have to pay for it, or could derive from it).

An incredibly important notion when assessing the impact of innovation on growth is total-factor productivity (TFP, or multifactor productivity, MFP): the variation of output that is accounted for neither by the variation of the labor input nor by the variation of the capital input. TFP is thus attributed to the effects of technological or organizational innovations.

Decomposing the GDP growth rate into the separate contributions of the various inputs is called growth accounting.
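Under the standard assumption of a Cobb-Douglas production function with a capital share alpha, growth accounting computes TFP growth as the residual: g_tfp = g_output - alpha * g_capital - (1 - alpha) * g_labor. A worked sketch in Python (all figures are invented for illustration):

    # Solow residual: the part of output growth not explained by input growth.
    alpha = 0.3        # capital share of income (assumed)
    g_output = 0.025   # 2.5% GDP growth
    g_capital = 0.030  # 3.0% growth in capital services
    g_labor = 0.010    # 1.0% growth in hours worked

    g_tfp = g_output - alpha * g_capital - (1 - alpha) * g_labor
    print(f"TFP growth: {g_tfp:.1%}")  # 0.9%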

 

Secular Stagnation: the thesis, grounded in historical growth statistics, that in the coming decades it will be impossible to reach the economic growth rates of the middle of the 20th century (indeed, those growth rates seem to be unattainable today).

Suggested explanatory factors vary: many revolve around monetary policy, but, of more interest to us, some experts point to a lesser impact of new technologies on economic growth (e.g. the effects of the steam engine or of electricity were much more significant than those of computers and the Internet).

A proponent of the Secular Stagnation thesis, the economist Robert Gordon emphasizes 4 headwinds for the next 25-40 years:

  1. Demographics: the labor participation rate started to decline in the 2000s.
  2. Education: the transition of the population towards higher education is complete.
  3. Rising income inequality.
  4. Rising government debt (as a ratio to GDP).

What’s more, Gordon downplays the arguments of techno-optimists. In his view, the rate of technological progress won’t be faster than in the last few years, so technological forces won’t succeed in counterbalancing these 4 major economic headwinds.

 

Productivity paradox: this notion is related to the Secular Stagnation theory; indeed, the paradox can be seen as the theory’s harbinger. As Wikipedia perfectly defines it, “the productivity paradox refers to the slowdown in productivity growth in the United States in the 1970s and 80s despite rapid development in the field of information technology (IT) over the same period”.

Some explain the productivity paradox by pointing to the larger productivity gains that were obtained from previous innovations such as electricity. Others contend that traditional economic indicators fail to grasp the benefits that have been derived from the ongoing IT revolution (e.g. improvements in the quality of products, not just in their price).

 

Institutions: the economist Douglass North defines institutions as “the rules of the game in a society or, more formally, the humanly devised constraints that shape human interaction”. Generally speaking, institutions are thus both (i) encompassing and (ii) soft structures of a given economy. Institutions can be economic (e.g. usual customer-supplier relationships among companies; the property rights system), social (e.g. the typical family structure) or political (e.g. the type of voting system).

Innovation economists carefully study institutions so as to understand which ones are more conducive to innovation and growth. E.g. which system leads to more innovation: strictly-enforced patents in Silicon Valley on the one hand, or the shanzhai culture of widespread imitation in Shenzhen, the electronics mecca, on the other?

 

Externality: “an externality is the cost or benefit that affects a party who did not choose to incur that cost or benefit” (thanks again, Wikipedia); it can be positive or negative. Negative externalities are almost always illustrated with environmental problems, e.g. a factory dumping toxic waste into a river generates negative externalities for the irrigation system or the fishermen downstream. Positive externalities are more related to innovation: e.g. a company developing a software program for an internal use case, then releasing it as an open source project, creates positive externalities for all the other individuals or organizations that will re-use it for free.

 

Frontier firms: in studies about innovation and productivity differences among firms, this term can be misleading. The adjective “frontier” is a relative concept referring to the firms that are the most innovative or productive (e.g. the top 10%). Hence it shouldn’t be interpreted as an absolute reference, that is to say firms that would leverage a given set of cutting-edge technologies - even if, of course, we expect the most productive firms to be more technologically advanced than their competitors.

 

GPT (General-Purpose Technology): a technology with all of these 3 key characteristics:

  1. “Pervasiveness – The GPT should spread to most sectors.
  2. Improvement – The GPT should get better over time and, hence, should keep lowering the costs of its users.
  3. Innovation spawning – The GPT should make it easier to invent and produce new products or processes.” (summary by Jovanovic & Rousseau)

Examples of GPTs are the steam engine, the electric motor and computers. There are two main questions around GPTs: (i) which technologies can be classified as such? and (ii) how important are GPTs to economic growth?

 

Tradable / nontradable: tradable goods and services can be produced in one place and consumed in another (e.g. cars), whereas nontradable goods and services have to be produced and consumed in the same place (e.g. a haircut). This distinction is critical to assess the impact of globalization on national economies. And regarding innovation, the $bn question is: how much do new technologies contribute to turning nontradable goods and services into tradable ones? (e.g. think about telemedicine)

 

Great Recession: the term used by economists to describe the recession that occurred in the wake of the 2007-08 financial crisis. In the USA, it lasted from December 2007 to June 2009. It is considered the most severe recession since the Great Depression of the 1930s.

 

Human capital: a key theory used by economists to analyze the evolution of the labor market and of labor productivity, it posits that “individuals possess an array of skills and attributes that earn a return in the labor market. Skills are durable, context-dependent [i.e. the value of skills depends on the sector, on the evolution of technologies which can be complements or substitutes], and responsive to investment [i.e. investing in learning leads to higher or additional skills], which happen to be some of the key properties of traditional, physical capital.” (Eric Nielsen)

Differences in human capital are thus used to account for differences in wages or employment levels within the workforce.

 

Skill-Biased Technical Change (SBTC): while economists usually consider technical change to be neutral regarding the demand for various skill levels, the SBTC hypothesis points to a transformation of the impact of new technologies over the last few decades: technologies are now highly complementary to skills, so they favor the demand for skilled over unskilled workers (skill being generally measured through educational attainment).

 

Skill premium: the difference between the wages of skilled and unskilled workers. It is generally considered that the skill premium has widened over the last few decades, with SBTC being suggested as a major factor.
