Technological Change (in my life) part-2

edit: 2021-03-18 (this is a stream-of-consciousness thing)

Epiphany 10: Smart Phones are really Star Trek Tricorders

Epiphany 11: "Moore's Law Revisited" and "the Big Data Revolution"

caveat: The following was partly inspired by the book Thank You for Being Late (2016) by Thomas Friedman

The Chessboard

The Integrated Circuit (chip)

Big Data (exploded in 2007-2008, but no one noticed because of the financial crash of 2008-2009)

Epiphany 12: GPU's and Moore's Law

The original definition of Moore's Law stated that the number of transistors on a single chip would "double every 18 months" while "costing the consumer the same amount". (note: after the first decade, Gordon Moore revised the doubling period to every 24 months). I remember reading articles in the mid-1990s claiming that Moore's Law would hit a limit in 2005 or 2006. The main reasons were attributed to manufacturing problems related to "the limits of photolithography" and "gate sizes so small that electrons would be able to quantum tunnel across barriers, thus rendering them conductors rather than semiconductors".

Around 2004 Intel announced a change in direction away from "faster single CPUs" toward "single multi-core CPUs" for which they charged more money. Technically speaking, they avoided Moore's Limit by revising Moore's Law to not include the phrase "costing the consumer the same amount". So now we have an economic Moore's Law as well as a technical one.
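The "technical" Moore's Law is easy to make concrete with a quick back-of-the-envelope sketch. This is my own illustration, not from the article; the 2,300-transistor Intel 4004 (1971) is a well-known starting data point:

```python
# Toy projection of transistor counts under Moore's Law
# with the revised 24-month doubling period.
def transistors(start_count, start_year, year, doubling_years=2.0):
    """Project a transistor count forward assuming a fixed doubling period."""
    return start_count * 2 ** ((year - start_year) / doubling_years)

# Intel 4004 (1971) shipped with roughly 2,300 transistors.
projection_1991 = transistors(2_300, 1971, 1991)  # ten doublings
print(f"{projection_1991:,.0f}")  # -> 2,355,200
```

Ten doublings between 1971 and 1991 predict roughly 2.4 million transistors; the real 1991-era Intel 486 shipped with about 1.2 million, so the prediction lands within a factor of two.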

Advances including FinFET technology, Tri-Gate technology, Gate-All-Around (GAA) technology, and 3D IC stacking have enabled the semiconductor industry to keep innovating. Most people reading this page will already be aware that the computing industry appears to be shifting from CPUs to GPUs. I was surprised to learn that graphics cards from Nvidia have beaten the technical Moore's Law for the last few product iterations.

Nvidia science cards (graphics cards without a video connector) break Moore's Law every year:

Year       Product                Gate Size (nm)
2006-2009  Tesla 1.0 / Tesla 2.0  90, 80, 65, 55, and 40
2010       Fermi                  40 and 28
2012       Kepler                 28
2014       Maxwell                28
2016       Pascal                 16 and 14

Epiphany 13: The Second Renaissance (Storm before the Calm?)

The Renaissance was a period in European history, from the 14th to the 17th century, regarded as the cultural bridge between the Middle Ages and our modern world. It needs to be mentioned that most people at the time did not know they were in a renaissance.
  1. Before Johannes Gutenberg introduced printing to Europe in 1439, it is safe to say that Europe was effectively run as a Catholic theocracy centered in Rome. But Gutenberg's technology enabled an alternative view of Christianity to flourish, which we now refer to as the Protestant Reformation. Many people thought it was their duty to reject this change, which eventually plunged Europe into the Thirty Years' War (1618-1648). While many Europeans died unnecessarily (since Christ was non-violent), European culture survived. The main takeaway from Gutenberg's technology is this: because many more people were able to get access to books, many learned to read. Some even learned to question church authority.
    • Gutenberg's invention is usually associated with publishing materials which enabled the Protestant Reformation, but I think this view misses the point: it also helped to publish new theories in natural philosophy (science) and mathematics when those fields began to explode ~150 years after Gutenberg's death in 1468.
  2. Galileo Galilei is best known for employing a telescope (a.k.a. Dutch spy glass) to observe four large moons orbiting Jupiter, rather than for observing objects on Earth. The Cliffs Notes version of the story has him going up against the Vatican, which clung to its belief that every heavenly body orbited an Earth located at the center of God's universe. Galileo was found guilty in 1633 (this was an inquisition) and sentenced to live out the remaining years of his life under house arrest (he was too well known to be put to death). YOU WOULD THINK that Galileo could have better defended himself by convincing his inquisitors to look into the telescope eyepiece while it was pointed at the moons of Jupiter. In fact, he tried this, but his accusers refused to look into it, claiming the device was the work of the devil.
    It needs to be pointed out that without a telescope to extend human senses, it is very difficult to determine whether "the sun orbits the earth" or "the earth orbits the sun" (non-telescopic measurements by Tycho Brahe came very close, but you would need to be a mathematician to interpret the results). This was the beginning of an era where technology was used to extend the limits of the human senses.
  3. Within ten years of Galileo's trial, some early scientists had begun to rearrange telescope lenses into a microscope, which then facilitated the discovery of bacteria and cell biology. Once again, technology had been employed to extend human senses into a new realm.
  4. There have been countless improvements in telescopes and microscopes since those days, but it would appear that humanity had hit a new limit. I say "had" because the invention of smaller yet more powerful computers, along with computer networking via the internet, has ushered in a new age of astronomy and biology where we have put electronic eyes on our machines. For example, computers associated with the Kepler spacecraft are able to sense the planetary occultation of distant stars. This is something human eyes could never do. Similarly, computers, robotic sequencers, and the internet enabled the first two Human Genome Projects as well as the Human Proteome Project.
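The Kepler technique just mentioned (sensing a transit as a tiny, repeating dip in a star's measured brightness) can be sketched in a few lines of Python. This is my own toy illustration with made-up numbers, not Kepler's actual detection pipeline:

```python
# Toy transit detection: flag samples where normalized stellar flux
# drops below a small threshold (a planet blocking ~1% of the light).
def find_dips(flux, threshold=0.995):
    """Return indices where normalized flux drops below the threshold."""
    return [i for i, f in enumerate(flux) if f < threshold]

# Simulated light curve: flat at 1.0 with a 0.8% dip during the transit.
light_curve = [1.0] * 20
light_curve[8:11] = [0.992, 0.992, 0.992]
print(find_dips(light_curve))  # -> [8, 9, 10]
```

A dip of less than 1% repeating at a fixed period is exactly the kind of signal a computer can pick out of years of photometry but a human eye at an eyepiece never could.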


  1. The internet actually got its start in 1969 as ARPAnet but was never taken seriously by the world until 1989, when Tim Berners-Lee invented the World Wide Web to solve a document-sharing problem at CERN. (Yep, the American military invented the internet, but European scientists made it useful to the world; it would appear that this renaissance began in Europe as well.)
  2. Many people have already claimed that the internet (er, web) will eventually enable more people to read (on a per capita basis) than did the efforts of Johannes Gutenberg, and I think this is true, but we're not there just yet.
  3. The downside of the internet is that it has facilitated communications between purveyors of various conspiracy theories (like-minded weirdos), and has recruited more people into terrorist organizations like ISIS. It has also facilitated a large crop of radio-like internet broadcasters which have further divided the population into political tribes. I believe this is why we witnessed "couldn't happen" events like BREXIT, or the election of a real estate tycoon as President of the United States. On the flip side, society might not be as divided as we think: while the political left dukes it out with the political right, we observe ~50% of the people not voting. This means the political left and right only represent about 25% each.
  4. This fight between left and right seems very reminiscent of the fight between Catholic and Protestant more than four centuries ago. While some people could not imagine a non-Catholic Europe, it was the children of the people who started the Thirty Years' War who put a stop to the madness by reminding everyone that Christianity was supposed to be a religion of non-violence, not of war or of inquisitions resulting in new ways to torture people to death. While 2019 marks 30 years since Tim Berners-Lee invented the web, we don't need to engage in a thirty years' war to put an end to our social madness. Just as Gutenberg introduced technology which enabled the advance of science while (eventually) reducing the importance of religious dogma, I wonder if the World Wide Web will make the dogma of political extremes a thing of the past.

Epiphany 14: What's in the water in the Nordic/Scandinavian Countries?

Year  Who                  Birth           What      Original Work Location  Notes
1985  Bjarne Stroustrup    Denmark         C++       New Jersey              was looking to add object support to the C language
1987  Andrew S. Tanenbaum  American        MINIX     Netherlands             was looking for an alternative to UNIX®
1990  Guido van Rossum     Netherlands     Python    Netherlands             was looking for a successor to ABC
1991  Linus Torvalds       Finland         Linux OS  Finland                 was working on a smaller, more efficient kernel
1995  David Axmark and
      Michael Widenius     Sweden/Finland  MySQL     Sweden                  were looking for an inexpensive SQL engine for PCs
1995  Rasmus Lerdorf       Denmark         PHP       Waterloo, Canada        was looking for a better CGI tool
2009  Michael Widenius     Finland         MariaDB   Sweden                  was looking for an alternative to Sun-supported MySQL

Epiphany 15: Big Data renders first-past-the-post voting irrelevant

This next section requires the reader to have some familiarity with a few terms. If you are already familiar with these, jump forward.
  1. Demographics and Demographic Analysis
  2. Psychographics
  3. Big Data
Facebook and Cambridge Analytica - A summary of what happened
re: American presidential election of 2016
re: BREXIT vote of 2016
  1. Facebook sold access to their customer data. Their business partners included Global Science Research and Cambridge Analytica, to name only two of many
    • Facebook (and all the other social media outlets) have always done this. It is how you get to use their platform free-of-charge.
    • This was not illegal -and- all Facebook users involved "clicked through" acceptance agreements which most people never read
  2. In 2014, part-time Cambridge University researcher, Aleksandr Kogan (who was also a co-director of Global Science Research), created a voluntary questionnaire in the form of a Facebook app called This Is Your Digital Life
    • in all versions of this story, this app was used to collect your Facebook "Likes and Dislikes" as well as a list of your Facebook friends
    • in many versions of the story, this app also collected the "Likes and Dislikes" of your friends (comment: since those second-level people were never presented with an acceptance agreement, this might be illegal, if not downright unethical)
    • in some versions of this story, there is mention of a 40-item politically-oriented questionnaire
  3. Aleksandr Kogan's company, Global Science Research, sold his questionnaire data to Cambridge Analytica, a data mining firm now known to be working for Trump's 2016 presidential campaign.
    • We later learned that Cambridge Analytica was also working with Canadian company AggregateIQ which was working with the Leave Campaign prior to the British BREXIT referendum
  4. The results of the voluntary questionnaire were run through a psychographic analysis which resulted in the Facebook participants being slotted into ~ 27 categories
    • I can't remember where I first saw the number "27" but any number of categories could be selected depending upon how the analysis was designed.
    • It is now believed that Cambridge Analytica collected data on 87 million Facebook users 
  5. Some of the categories identified Facebook users who...
    • never vote
    • always vote Republican
    • always vote Democratic
    • are centrists (in the political middle) -and- might be convinced to vote one way or the other
    • might vote for Hillary Clinton
  6. This last group was targeted with advertising which may have masqueraded as fake news. The intent of this targeted advertising was to tilt the result ever so slightly in the desired direction (pro-Trump or pro-BREXIT). Something like this can work in spades in any first-past-the-post (winner-take-all) election or plebiscite. In the case of the Hillary Clinton supporters, all that was needed was to convince her centrist supporters to "change their vote" or "stay home"
  7. Even the Silicon Valley big-wigs didn't see this one coming.
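To make the idea of psychographic "slotting" concrete, here is a toy Python sketch. The category names, traits, and scoring method are all my own invented placeholders (the real Cambridge Analytica models were never published); the point is only that questionnaire answers can be mapped onto category profiles automatically:

```python
# Toy psychographic slotting: assign each user to the nearest category
# profile by squared distance over a couple of invented trait scores.
PROFILES = {
    "never_votes":    {"interest_in_politics": 0.0, "party_loyalty": 0.0},
    "loyal_partisan": {"interest_in_politics": 0.9, "party_loyalty": 0.9},
    "persuadable":    {"interest_in_politics": 0.6, "party_loyalty": 0.2},
}

def slot(user):
    """Assign a user to the profile with the smallest squared distance."""
    def dist(profile):
        return sum((user[k] - v) ** 2 for k, v in profile.items())
    return min(PROFILES, key=lambda name: dist(PROFILES[name]))

print(slot({"interest_in_politics": 0.7, "party_loyalty": 0.1}))  # -> persuadable
```

Scale the same idea up to dozens of traits scored from Facebook "Likes" and you get something like the ~27 categories described above, with the "persuadable" bucket being the one worth advertising to.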
Comments - Observations
  1. Since all first-past-the-post elections can now be manipulated by big data technology combined with psychographics, democracies need to shift to proportional representation. Big data and psychographics will be able to affect these as well, but without a winner-take-all award.
  2. Mark Twain once said, "It's easier to fool people than to convince them they have been fooled." There is no way you will be able to convince the winning side of the American election that there should be a do-over, especially now that Trump has already been sworn in. But Britain has not yet left the EU, so it could engage in a do-over. However, I suspect that the winning side in BREXIT is thinking politically rather than rationally.
  3. I am very worried about America's continual blaming of Russia and/or Putin. As long as Americans continue to play this blame game, they are setting themselves up for a repeat performance from their own social media corporations.
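The difference between first-past-the-post and proportional outcomes is easy to demonstrate with a toy election. The numbers below are invented; the allocation shown is the standard largest-remainder method, used here only as one example of a proportional scheme:

```python
# Toy comparison: first-past-the-post vs. largest-remainder proportional
# allocation, over 10 identical ridings with made-up vote counts.
def fptp(ridings):
    """Winner-take-all: each riding's plurality winner takes one seat."""
    seats = {}
    for votes in ridings:
        winner = max(votes, key=votes.get)
        seats[winner] = seats.get(winner, 0) + 1
    return seats

def proportional(ridings, total_seats):
    """Largest-remainder allocation over the pooled national vote."""
    totals = {}
    for votes in ridings:
        for party, n in votes.items():
            totals[party] = totals.get(party, 0) + n
    all_votes = sum(totals.values())
    quotas = {p: total_seats * n / all_votes for p, n in totals.items()}
    seats = {p: int(q) for p, q in quotas.items()}
    leftovers = sorted(quotas, key=lambda p: quotas[p] - seats[p], reverse=True)
    for p in leftovers[: total_seats - sum(seats.values())]:
        seats[p] += 1
    return seats

# Party A wins every riding 52-48: FPTP hands it ALL the seats.
ridings = [{"A": 52, "B": 48}] * 10
print(fptp(ridings))               # -> {'A': 10}
print(proportional(ridings, 10))   # -> {'A': 5, 'B': 5}
```

Under FPTP, nudging a 52-48 split to 48-52 in each riding flips every seat, which is why a small, well-targeted push on "persuadables" pays off so handsomely; under the proportional scheme the same nudge moves only one seat.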

Supporting Material

Epiphany 16: Technical revolutions are (sometimes) just evolutions

When you talk to people about the Industrial Revolution, most only think about one big change beginning somewhere between 1800 and 1900. But this is a gross oversimplification if you consider that it started with the age of steam, then transitioned to the age of electricity, then transitioned to the information age. When we talk about technology in the information age, should we begin with computers, or should we first start with getting information to people (scrolls, books, telegraph, radio, television, cable television, internet)?

The Evolution of Locomotives

Thinking about locomotives for a moment: they began by burning wood to produce steam, which was used to turn the wheels. Europe quickly experienced wood shortages, so locomotives switched over to coal (or any fossil fuel) with little difficulty. It is well known that humans have burned petroleum for over 5,000 years, but it wasn't until the mid-1800s that the industrial revolution commercialized petroleum extraction and refinement. Steam locomotives eventually morphed into diesel locomotives, where the fuel is burned in combustion engines to directly operate pistons (i.e. no intermediate steam is required). But the immense power was difficult to control via a transmission, so diesel locomotives morphed into diesel-electric systems, where a diesel engine runs an electrical generator which is then used to power electric motors. At this point you can see that if external electricity is available, then a locomotive might be made more efficient by doing away with the diesel engine and electrical generator. It would definitely weigh a whole lot less.

The Evolution of Computers


Software Systems


  1. Improvements in either category (hardware or software) were not linear but exponential. Improvements in hardware and software together are multiplicative (think: exponential on steroids)
  2. Back in the late 1970s, a 16-bit minicomputer like the PDP-11/44 employed an 8085 to run the processor's console switches, LEDs, and serial interface. But some people in the computer industry could never accept the fact that microprocessors like the 8085 (or its descendants) would make the minicomputer processor obsolete
  3. Today I run into all kinds of people who just refuse to accept the fact that CLOUD-based systems are taking over from their favorite computer hardware and OS

Epiphany 17: What is old is new again (Python vs BASIC)

Epiphany 18: Industrial Revolutions can be disruptive

The First Industrial Age (Age of Steam)
years: 1800-1900
The Second Industrial Age (Age of Electricity)
years: 1900-2000
The Third Industrial Age (Age of Data/Information)
years: 2000-2100
comment: So when politicians claim that "they will make America Great again" or that "BREXIT can bring back lost jobs", those politicians are either lying or ignorant of the facts just presented. Truth be told: Future

Epiphany 19: The dominance of C/C++ (part 3/3)

In the early days of computing, most hardware manufacturers also created, then sold, software. Putting aside operating systems for a moment, those vendors also sold programming tools, including compilers for programming languages like Fortran (1957), ALGOL (1958), and COBOL (1959). One of the most important changes to the computing industry occurred when computer professionals got together to standardize computer languages.

One measure of any computer language's success is, I think, the number of standardizations. For example, although the BASIC language was once very popular (I am still forced to use VMS-BASIC every day by my employer in 2019), it has gone through a very small number of standardizations. This might have something to do with the fact that many BASIC implementations were so different that standardization was not possible or, perhaps, desirable. On the other hand, languages like C, C++, and Objective-C have gone through numerous standards and continue to be improved.

For example, non-standard "C" first appeared in 1972 and is now referred to as K&R C after its authors, Brian Kernighan and Dennis Ritchie. Improvements were formalized and then published as C89 in 1989 by ANSI and as C90 in 1990 by ISO. This continued with the names C99, C11, and C18, as described here.

comment: It appears that "C" moves to a new standardization level approximately every 10 years (on average) whilst C++ moves to a new level approximately every 3 years (on average)
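That cadence estimate is easy to sanity-check with a few lines of Python, using the ratified C standard years named above plus the post-2011 C++ standards (C++11, C++14, C++17, C++20), which the C++ committee has shipped on a three-year cycle:

```python
# Average number of years between successive language standards.
def average_gap(years):
    gaps = [b - a for a, b in zip(years, years[1:])]
    return sum(gaps) / len(gaps)

c_standards   = [1989, 1999, 2011, 2018]   # C89, C99, C11, C18
cpp_standards = [2011, 2014, 2017, 2020]   # C++11, C++14, C++17, C++20

print(average_gap(c_standards))    # ~9.7 years
print(average_gap(cpp_standards))  # -> 3.0 years
```

So "every 10 years" for C and "every 3 years" for modern C++ both check out.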


Since roughly 50% of all the digital devices on the planet "running an operating system" use some form of Linux, it should be no surprise that it is the Linux community that is pushing newer versions of C and C++.

Since gcc is used to build Linux, we should look at this toolset a little more closely.


Epiphany 20: A few thoughts on the 50th anniversary of Apollo 11 landing on the moon

The manned spacecraft programs in both Russia and the United States changed the world in more ways than most people will ever know. First off, humanity cannot ignore the contribution of the Soviet space program, because the United States would never have attempted such a thing if it were not for cold-war politics. The plain fact is this: many Americans didn't care about Apollo but did care about beating the Russians at something. History records that President Richard Nixon and Henry Kissinger were plotting to terminate the Apollo program at the very time that Nixon was congratulating, via telephone from the White House, Apollo 11 astronaut Neil Armstrong, who was standing on the Moon (both Nixon and Kissinger thought that Vietnam was a more urgent issue; more on this in a moment). To add insult to injury, many Americans lost interest after the apparently routine flight of Apollo 12 (e.g. we won that race twice; time to move on). Proof of this can be seen in the number of citizens who complained to the TV networks about favorite TV programs being preempted by spaceflight coverage. Heck, in-flight video transmissions from Apollo 13 were not aired until after a tank explosion placed the mission in jeopardy. Part of this public disinterest led to the cancellation of Apollo flights 18-20.

FACT: development of the Apollo Guidance Computer (AGC) was the trigger event for an enormous amount of human technological progress. The design work was done by Draper Labs at MIT while manufacturing was done by Raytheon. Why was the AGC necessary? Initially, many astronauts and cosmonauts incorrectly thought that human pilots would be able to directly fly spacecraft much in the same way that pilots flew aircraft. Consider this thought experiment: you are flying the Lunar Module (LM) and need to catch up to, then dock with, the Command Module (CM). For simplicity, assume that both vehicles have identical orbits and velocities but are separated by a distance of 1,000 m (3,280 ft). Without thinking about life outside of the atmosphere, you fire your RCS thrusters (force: 100 pounds or 444 newtons) while aiming at the Command Module. This will increase your velocity, which pushes you into a higher orbit. Your new orbital velocity is faster, but your orbital period is now longer. This causes the Command Module to quickly pass under you, making it impossible to dock. One correct solution dictates that you should fire your RCS thrusters away from the target vehicle, which will cause you to drop into a slightly lower orbit; then wait a short period of time; then fire your RCS thrusters in the opposite direction, which should return you to the original orbit of the CM but, hopefully, much closer (BTW, first firing forward then quickly firing backward produces the same result). Remember "F = ma" from Isaac Newton's second law? Since "a = dV/dT", the second law can be rewritten as "F = m x dV/dT", which becomes "F x dT = m x dV" (the left side of the equation is known as impulse). The instantaneous mass of the LM (which decreases every time you fire your thrusters) determines how long you should fire them in every maneuver (e.g. two one-second thrusts will not produce identical results; a one-second forward burn cannot be exactly cancelled by a one-second reverse burn).
These real-time calculus solutions are best determined by a guidance computer, because fuel is limited and must be conserved.
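The impulse relation above (F x dT = m x dV) is enough to show why burn timing depends on the vehicle's current mass. The 444 N thrust figure comes from the text; the LM masses below are my own round, hypothetical numbers used only for illustration:

```python
# Burn time for a given velocity change: F*dt = m*dv  =>  dt = m*dv/F
def burn_time(mass_kg, delta_v_ms, force_n=444.0):
    """Seconds of constant thrust needed for a small velocity change.
    Assumes the mass stays constant over the (short) burn."""
    return mass_kg * delta_v_ms / force_n

# The same 1 m/s change needs very different burn times as propellant is spent:
print(round(burn_time(15_000, 1.0), 1))  # heavy LM (hypothetical mass): 33.8 s
print(round(burn_time(5_000, 1.0), 1))   # light LM (hypothetical mass): 11.3 s
```

A factor-of-three difference in required burn time for the identical maneuver is exactly the kind of bookkeeping that is better left to a guidance computer than to a pilot's intuition.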

How did the development of the AGC improve things here on Earth? First off, commercial mainframe computers in the 1960s were manufactured from discrete electronic components, including individual transistors and diodes. So when IBM learned that the AGC had to fit into a volume the size of a breadbox (one cubic foot, or 28,316 cc), many IBM engineers didn't think it was possible. The Draper/Raytheon solution employed "integrated circuits" (semiconductor chips containing numerous transistors), which they were already using in a more primitive way inside Polaris missile guidance systems. The high per-component prices meant that the American government was their primary customer (Apollo consumed 60% of the integrated circuits produced in America in 1966). Because of the high cost, government contractors developed semiconductor test methods to ensure that the government would only pay for components that met design specifications. These testing methods eventually migrated from the customer (the government) back to the manufacturing industry, which resulted in affordable chips for the rest of us. That revolution in chip manufacturing produced things like:

comment: Apollo used 3-input NAND gates manufactured by Fairchild Semiconductor. Fairchild engineers would later leave to form Intel and, later still, AMD

Software jobs also changed drastically during this time. While it is true that high-level programming languages like FORTRAN (1957) and COBOL (1959) existed, the phrase "computer programmer" did not yet exist, as computer programming was primarily done by mathematicians. High-level languages required more memory and CPU power than what was available on the AGC, but they were employed on mainframe computers used "to run AGC flight simulations" and then "generate the necessary binary code" for the hand-made read-only core rope memory used to hold AGC programs. The level of redundancy built into the AGC programs (see reference-1) should make it clear that these people were doing "computer engineering". Click Margaret Hamilton to see what I mean.

Critics of Apollo mention that the program was too expensive, in that it was consuming too much of the national budget, with 4% being the oft-quoted number. Let me remind everyone that cold-war concerns at the time mandated that the defense budget be kept secret. On top of that, no American citizen knew how much money was being used to support the Vietnam War. Today we know that the total cost of Apollo (in 1968 dollars) was $25 billion, whilst the cost of the Vietnam War (also in 1968 dollars) was $168 billion. Now, everyone knows that it is way harder to create something than it is to destroy something, so allow me to state the obvious: America got no return on the $168 billion investment. Check out the next chart, then advise your political representatives accordingly:

Activity: Apollo Manned Spacecraft Program
Cost: $25 billion (1968 dollars)
Human cost: 3 astronauts
ROI (return on investment): advances in metallurgy; advances in semiconductor technology; advances in computer engineering; advances in software engineering; invention of the internet; the admiration of the whole world
notes: During the peak years, the Apollo program employed approximately 400,000 scientists, engineers, and technicians across 20,000 companies. Much of this work was done by, or managed by, defense contractors. Many Americans continue to bang on about the USA being a Christian nation, but I wonder if they will ever turn their spears into pruning hooks as mentioned in Isaiah 2:3-4.

Activity: Vietnam War
Cost: $168 billion (1968 dollars)
Human cost: 58,000 US soldiers killed; 200,000 US soldiers injured; 2 million Vietnamese civilians; 1.1 million North Vietnamese fighters; 200,000 South Vietnamese soldiers; 50,000 civilians in Laos (up to 1973); 259,000 civilians in Cambodia; Agent Orange
ROI: the contempt of the whole world
notes: During some peak years, more than 400,000 American soldiers were committed to Vietnam (almost the same number of people tied to the manned spaceflight effort). Despite crazy political justifications like "the domino theory", America lost this war, and no literal or metaphorical dominoes were ever observed.

Activity: First Gulf War
Cost: $61 billion
Human cost: 382 US military casualties; others: ???
ROI: American defense contractors do well; first use of "depleted uranium" by the Americans

Activity: Middle-East Wars
Cost: $5.9 trillion
Human cost: ???
ROI: American defense contractors do well; first use of extraordinary rendition
notes: Hopefully everyone reading this knows that "1 trillion" = "1,000 billion"

war references:
  1. (published by: Congressional Research)
  6. comment: Since World War Two ended in 1945, the United States of America has lost every war. But no one seems to care as long as the economy is humming along.

What About China?

Back in the 1960s, conservatism revolved around business interests which caused some Americans to wonder if China was a missed business opportunity. This was the main reason for Nixon and Kissinger opening relations with China in 1972. IMHO this trigger event started the Chinese shift from agriculture to industrialism. (think of this as an industrial revolution confined to one country)

American companies and schools in the 1980s found themselves in the Reagan era of "minimal government", which now goes by the name austerity. This might have translated into greater economic problems, including unemployment, except for the actions of China's leader, Deng Xiaoping, who favored "maximal government" and so was paying to send thousands of Chinese students to the USA every year to be educated.

I personally experienced this in Boston in 1985: we had morning lectures and afternoon labs. An English-speaking Chinese student sat one row ahead of me in the lecture hall, accompanied by two minders who could not speak English but were required to pay for student slots (these minders were there to ensure the student would return to China; they passed the day in class by reading little brown books of political dogma). Back then, Americans correctly welcomed these foreign students (it was a business opportunity), but no one ever thought that China would eventually compete head-to-head with the USA. I applaud the Chinese students who were able to acquire an education in a foreign country speaking a foreign language, but I wonder how many Americans would be willing, or able, to do the same by travelling to China.

Steps to MAGA (Make America Great Again)

    • The chart above provides proof that many millions of people would be alive today if the USA had not been funding foreign wars on foreign shores. One way to make America great again is to stop funneling taxpayer money into those defense programs which, in reality, are offense programs used to support the American empire
    • Like an addict who is unable to quit cold turkey, defense contractors will probably not be able to survive having their government funds reduced. But just as happened during the 1950s and 1960s, defense contractors could be employed by the government to do big science projects through civilian agencies like NASA
    • During the 1990s, the American defense budget was always under $300 billion per year. By 2010 the defense budget had climbed to $721 billion per year. Meanwhile, NASA's budget has fluctuated between $17 billion and $20 billion since 2006. If NASA's budget were doubled by diverting money from the defense budget, would the Pentagon even notice? And yet we know that spending on NASA will have a positive ROI (return on investment)
    • DARPA is one American defense investment with a very high ROI
    • There are still many people alive today who will tell you that they attended Harvard University in the 1950s and only paid $100.00 tuition
    • Before defense contractors started receiving the bulk of government monies, the American government used to subsidize college and university educations. Funding for education slowly dried up as money was diverted into the defense/offense budgets.
    • Once money is diverted from offense back into education, the whole economy will repair itself

Epiphany 21: Microsoft is promoting Python3 (Python version 3.x)

2020-05-06: Today's alert from ZDNet informs me that Microsoft has added 51 Python videos to their previous 44. I read somewhere that, because Bill Gates had started programming in BASIC, Microsoft could never really get rid of it (other than the massive changes to Visual Studio after VS6, where they made it play nice with C/C++ and anything else that generated code under the .NET framework). I wonder what Bill would say about the shift to Python?

"Python for beginners" playlist from Microsoft (this is the original 44)
"More Python for beginners" (this is the new 20)
"Even More Python for Beginners" (this is the new 31)

External Links

Back to Home
Neil Rieck
Waterloo, Ontario, Canada.