Technological Change (in my life) part-2

edit: 2023-11-30 (this is a stream-of-consciousness thing)

Epiphany 10: Smart Phones are really Star Trek devices

Epiphany 11: "Moore's Law Revisited" and the 'Big Data' Revolution


Moore's Law note: The following was partly inspired by the book Thank You for Being Late (2016) by Thomas Friedman

The Chessboard

The Integrated Circuit (chip)

The Speed of Change

In response to the 1992 Russo-American moratorium on nuclear testing, the US government started ASCI (Accelerated Strategic Computing Initiative) in 1996 to provide a way to develop and maintain nuclear weapons through simulation. ASCI Red was delivered in 1997 and could execute 1.8 TeraFLOPS. It cost $55 million, was a little smaller than a tennis court, and required the electrical power of roughly 800 homes. In 2006, Sony released the PS3, which could also execute 1.8 TeraFLOPS at a cost of $300 while requiring only the power of three 120 W light bulbs.
Excerpt from page 41 of "Thank You for Being Late" (2016) by Thomas Friedman
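
To put the scale of that change in perspective, here is a quick back-of-the-envelope calculation in Python (a rough sketch only; the assumption that an average home draws about 1.2 kW is mine, not Friedman's):

# Rough FLOPS-per-dollar and FLOPS-per-watt comparison of ASCI Red vs. the PS3.
# Assumption (mine, not Friedman's): an average home draws ~1.2 kW, and the
# PS3's "three 120 W light bulbs" works out to ~360 W.
TERA = 1e12

asci_red = {"flops": 1.8 * TERA, "cost": 55_000_000, "watts": 800 * 1200}
ps3      = {"flops": 1.8 * TERA, "cost": 300,        "watts": 3 * 120}

for name, box in (("ASCI Red (1997)", asci_red), ("PS3 (2006)", ps3)):
    print(f"{name:16s}{box['flops'] / box['cost']:>16,.0f} FLOPS per dollar"
          f"{box['flops'] / box['watts']:>16,.0f} FLOPS per watt")
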

Big Data Revolution of 2006-2008

Epiphany 12: GPUs and Moore's Law

The original definition of Moore's Law stated that the number of transistors on a single chip would "double every year" while "costing the consumer the same amount".
(note: after the first decade, Gordon Moore revised the doubling period to every two years)

I remember reading articles in the mid-1990s claiming that Moore's Law would hit a limit in 2005 or 2006. The main reasons given were manufacturing problems related to "the limits of photolithography" and "gate sizes so small that electrons would be able to quantum tunnel across barriers, thus rendering them conductors rather than semiconductors".

Around 2004, Intel announced a change in direction away from "faster single CPUs" toward "single multi-core CPUs" for which they charged more money. Technically speaking, they avoided Moore's Limit by revising Moore's Law to drop the phrase "costing the consumer the same amount". So now we have an economic Moore's Law as well as a technical one.
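
As a sanity check on the exponential math behind the revised law, here is a minimal sketch (assuming a two-year doubling period and using the Intel 4004's roughly 2,300 transistors of 1971 as the starting point; both baseline numbers are my own choice for illustration):

# Minimal Moore's Law sketch: transistor count doubles every PERIOD years.
# Baseline (chosen for illustration): the Intel 4004 of 1971, ~2,300 transistors.
START_YEAR, START_COUNT, PERIOD = 1971, 2300, 2

def projected_transistors(year):
    """Return the projected transistor count for a given year."""
    return START_COUNT * 2 ** ((year - START_YEAR) / PERIOD)

for year in (1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors per chip")
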

Advances including FinFET technology, Tri-Gate technology, Gate-all-around (GAA) technology, and 3D IC stacking have enabled the semiconductor industry to keep innovating. Most people reading this page will already be aware that the computing industry appears to be shifting from CPUs to GPUs. I was surprised to learn that graphics cards from Nvidia have beaten the technical Moore's Law for the last few product iterations (see the table below).

Nvidia science cards (graphics cards without a video connector) break Moore's Law every year

Year  Product                                                     Gate Size (nm)
2010  https://en.wikipedia.org/wiki/Fermi_(microarchitecture)     40 and 28
2012  https://en.wikipedia.org/wiki/Kepler_(microarchitecture)    28
2014  https://en.wikipedia.org/wiki/Maxwell_(microarchitecture)   28
2016  https://en.wikipedia.org/wiki/Pascal_(microarchitecture)    16 and 14
2017  https://en.wikipedia.org/wiki/Volta_(microarchitecture)     12
      (see also: https://www.theregister.co.uk/2017/05/24/deeper_dive_into_gtc17/)

https://en.wikipedia.org/wiki/List_of_eponyms_of_Nvidia_GPU_microarchitectures
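
Because transistor density scales roughly with the inverse square of the feature size, the nodes in the table above imply the following density gains (a rough sketch only; real density depends on far more than the advertised node number, and where two nodes are listed I am using the first one):

# Rough relative-density estimate from the process nodes in the table above.
# Assumption: density scales as 1/(node size)^2, which ignores many real-world
# factors (cell libraries, marketing node names, etc.).
nodes = {"Fermi (2010)": 40, "Kepler (2012)": 28, "Maxwell (2014)": 28,
         "Pascal (2016)": 16, "Volta (2017)": 12}

baseline = nodes["Fermi (2010)"]
for arch, nm in nodes.items():
    print(f"{arch:15s} {nm:3d} nm  ~{(baseline / nm) ** 2:4.1f}x the density of Fermi")
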

Epiphany 13: The Second Renaissance (Storm before the Calm?)

The Renaissance was a period in European history, from the 14th to the 17th century, regarded as the cultural bridge between the Middle Ages and our modern world. It needs to be mentioned that most people at the time did not know they were in a renaissance.
  1. Before Johannes Gutenberg introduced printing to Europe in 1439, it is safe to say that Europe was effectively run as a Catholic theocracy centered in Rome. But Gutenberg's technology enabled an alternative view of Christianity to flourish, which we now refer to as the Protestant Reformation. Many people thought it was their duty to reject this change, which eventually plunged Europe into the Thirty Years' War (1618-1648). While many Europeans died unnecessarily, European culture survived. The main takeaway from Gutenberg's technology is this: because many more people were able to get access to books, many learned to read. Some even learned to question church authority.
  2. Galileo Galilei is best known for employing a telescope (a.k.a. Dutch spy glass) to observe four large moons orbiting Jupiter rather than objects on Earth. The Cliffs Notes version of the story has him going up against the Vatican, which clung to the belief that every heavenly body orbited an Earth located at the center of God's universe. Galileo was found guilty in 1633 (this was an inquisition) and sentenced to live out the remaining years of his life under house arrest (he was too well known to be put to death). YOU WOULD THINK that Galileo could have better defended himself by convincing his inquisitors to look into the telescope eyepiece while it was pointed at the moons of Jupiter. In fact, he tried this, but his accusers refused to look, claiming the device was the work of the devil.
    It needs to be pointed out that without a telescope to extend human senses, it is very difficult to determine whether "the sun orbits the earth" or "the earth orbits the sun" (non-telescopic measurements by Tycho Brahe came very close, but you needed to be a mathematician to interpret the results). This was the beginning of an era where technology was used to extend the limits of the human senses.
  3. Within ten years of Galileo's trial, some early scientists had begun to rearrange telescope lenses into a microscope, which then facilitated the discovery of bacteria and cell biology. Once again, technology had been employed to extend human senses into a new realm.
  4. There have been countless improvements in telescopes and microscopes since those days, but humanity had hit a new limit. I say "had" because the invention of smaller yet more powerful computers, along with computer networking via the internet, has ushered in a new age of astronomy and biology where we have put electronic eyes on our machines. For example, computers associated with the Kepler spacecraft are able to detect planets transiting distant stars. This is something human eyes could never do. Similarly, computers, robotic sequencers, and the internet enabled the first two Human Genome Projects as well as the Human Proteome Project.

Observations/Speculations

  1. The internet actually got its start in 1969 as ARPAnet but was never taken seriously by the world until 1991, when the World Wide Web (invented by Tim Berners-Lee in 1989 to solve a document-sharing problem at CERN) was released to the public. (yep, the American military invented the internet but European scientists made it useful to the world; it would appear that this new renaissance began in Europe as well)
  2. Many people have already claimed that the internet (er, web) will eventually enable more people to read (on a per capita basis) than did the efforts of Johannes Gutenberg and I think this is true but we're not there just yet.
  3. The downside of the internet is that it has facilitated communications between crazy people and their conspiracy theories. It has also facilitated a large crop of radio-like internet broadcasters which have further divided the population into political tribes. I believe this is why we witnessed "couldn't happen" events like BREXIT, or the election of a real-estate tycoon as President of the United States. On the flip side, society might not be as divided as we think: while the political left dukes it out with the political right, we observe roughly 50% of the people not voting, which means the political left and right only represent about 25% each.
  4. This fight between left and right seems very reminiscent of the fight between Catholic and Protestant more than four centuries ago. While some people could not imagine a non-Catholic Europe, it was the children of the people who started the Thirty Years' War who put a stop to the madness by reminding everyone that Christianity was supposed to be a religion of non-violence; not war or inquisitions resulting in new ways to torture people to death. While 2019 marks 30 years since Tim Berners-Lee invented the web, we don't need to engage in a thirty-year war of our own to put an end to our social madness. Just as Gutenberg introduced technology which enabled the advance of science while (eventually) reducing the importance of religious dogma, I wonder if the world-wide-web will make the dogma of political extremes a thing of the past.

Epiphany 14: Something in the water of Nordic-Scandinavian countries?

year  who                  birthplace   what     work location     notes
1985  Bjarne Stroustrup    Denmark      C++      New Jersey        was looking to add object support to the C language
1987  Andrew S. Tanenbaum  USA          MINIX    Netherlands       was looking for an alternative to UNIX®
1990  Guido van Rossum     Netherlands  Python   Netherlands       was looking for a successor to ABC
1991  Linus Torvalds       Finland      Linux    Finland           was working on a smaller, more efficient kernel
                                        Git      USA               a distributed version control system that tracks changes
1995  David Axmark         Sweden       MySQL    Sweden            was looking for an inexpensive SQL engine for PCs
      Michael Widenius     Finland
1995  Rasmus Lerdorf       Denmark      PHP      Waterloo, Canada  was looking for a better CGI tool
2009  Michael Widenius     Finland      MariaDB  Sweden            was looking for an alternative to SUN-supported MySQL

Epiphany 15: Big Data renders first-past-the-post voting irrelevant

This next section requires the reader to have some familiarity with a few terms:
  1. Demographics and Demographic Analysis
  2. Psychographics
  3. Big Data
Facebook and 'Cambridge Analytica' - a summary of what happened
Wikipedia: American presidential election of 2016
Wikipedia: BREXIT vote of 2016
 
  1. Facebook sold access to their customer data. One of their business partners was Global Science Research.
    • Social media outlets have always done this. It is how you get to use their platform free-of-charge.
    • This was not illegal -and- all Facebook users involved "clicked through" acceptance agreements which most people never read
  2. In 2014, Global Science Research co-director Aleksandr Kogan created a Facebook app / questionnaire called This Is Your Digital Life
    • in all versions of this story, this app was also used to collect your Facebook "Likes and Dislikes"
    • in many versions of the story, this app also collected the "Likes and Dislikes" of your FB friends
      comment: since those secondary-level people were never presented with an acceptance agreement, this seems unethical if not illegal
  3. Aleksandr Kogan's company, Global Science Research, sold his questionnaire data to Cambridge Analytica, a data mining firm:
    • who sold information to AggregateIQ which was hired by the Leave Campaign prior to the British BREXIT referendum
    • now known to be working for the 2016 political campaigns of Ted Cruz and Donald Trump
  4. The results of the voluntary questionnaire were run through a psychographic analysis which resulted in the Facebook participants being slotted into 27 categories
    • It is now believed that Cambridge Analytica collected data on 87 million Facebook users
  5. Some of the categories identified Facebook people who...
    • never vote
    • always vote Republican (or conservative)
    • always vote Democratic (or liberal)
    • are political centrists -and- who might be convinced to vote one way or the other
  6. This last group was targeted with advertising (which may have masqueraded as fake news) with the intent of convincing some to vote in the desired direction (pro-Trump or pro-BREXIT) or to just stay home. This works well in any first-past-the-post (winner-take-all) election or plebiscite (see the sketch after this list).
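
The mechanics of that pipeline can be sketched in a few lines of Python (a purely hypothetical illustration; the category names, scores, and threshold below are invented and are not Cambridge Analytica's actual model):

# Hypothetical sketch of psychographic slotting followed by ad targeting.
# Every name, score, and rule below is invented for illustration.
voters = [
    {"name": "user_a", "votes": "always_left",  "openness": 0.2},
    {"name": "user_b", "votes": "always_right", "openness": 0.3},
    {"name": "user_c", "votes": "centrist",     "openness": 0.8},
    {"name": "user_d", "votes": "never",        "openness": 0.9},
]

def slot(voter):
    """Slot a voter into a coarse psychographic category."""
    if voter["votes"] in ("always_left", "always_right"):
        return "partisan"                 # advertising is wasted here
    if voter["votes"] == "never":
        return "non-voter"
    return "persuadable" if voter["openness"] > 0.5 else "centrist-firm"

targets = [v["name"] for v in voters if slot(v) == "persuadable"]
print("ad targets:", targets)             # only the persuadable centrists see the ads
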
Comments - Observations
  1. Since all first-past-the-post elections can now be manipulated by big-data technology combined with psychographics, democracies need to shift to proportional representation (see the worked example after this list).
  2. Mark Twain is often credited with saying "It's easier to fool people than to convince them that they have been fooled."
  3. I am very worried about America's continual blaming of Russia and/or Putin. As long as Americans do this, they will be blind to the effect of social media in the next election.
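
To see why comment #1 points at proportional representation, here is a tiny worked example (the riding count and vote shares are invented, and the simplification that every riding splits the same way is mine):

# Invented example: 100 ridings in which party A ends up with 51% of the vote
# everywhere after a small, targeted swing.
ridings, vote_share_a = 100, 0.51

fptp_seats_a = ridings if vote_share_a > 0.5 else 0   # winner takes every riding
pr_seats_a = round(ridings * vote_share_a)            # seats track the vote share

print(f"FPTP: party A wins {fptp_seats_a}/{ridings} seats with {vote_share_a:.0%} of the vote")
print(f"PR  : party A wins {pr_seats_a}/{ridings} seats with {vote_share_a:.0%} of the vote")
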

Supporting Material

Epiphany 16: Industrial/Technical Evolution (part-1)

Many people associate the Industrial Revolution with the years 1760-1860, but this might be an oversimplification if you consider that it began with the age of steam, then transitioned to the age of electricity, then transitioned to the age of information. Many associate the information age with computers, but this misses the point that information already flowed through earlier electrical technologies (e.g. telegraph, telephone, radio, television, cable television, internet) and non-electrical technologies (e.g. scrolls, books, newspapers).

The Evolution of Locomotives

steam (wood + coal) >> petroleum >> electricity

Thinking about locomotives for a moment: they began by burning wood to produce steam which was used to turn wheels. Europe quickly experienced wood shortages, so locomotives switched over to coal (or any fossil fuel) with little difficulty. Humans have burned petroleum for over 5,000 years, but it wasn't until the mid-1800s that the industrial revolution commercialized petroleum extraction and refinement.

Steam locomotives eventually morphed into diesel locomotives where the fuel is burned in combustion engines to directly operate pistons (i.e. no intermediate steam is required). But the immense power was difficult to control via a mechanical transmission, so diesel locomotives morphed into diesel-electric systems where "a diesel engine runs an electrical generator" which is then used to power electric motors.

At this point you can see that if an external electricity source is available then a locomotive might be made much more efficient (lighter) by doing away with the diesel engine and electrical generator.

The Evolution of Computers

Hardware

Software Systems

Observations:

  1. Improvements in either category (hardware or software) were not linear but exponential. Improvements in hardware and software together are multiplicative (think: exponential on steroids; see the sketch after this list)
  2. Back in the late 1970s, a 16-bit minicomputer like the PDP-11/44 employed an 8085 microprocessor to run the processor's console switches, LEDs, and serial interface. But some people in the computer industry could never accept the idea that microprocessors like the 8085 (or its descendants) would one day make the minicomputer processor obsolete.
  3. Today I run into all kinds of people who just refuse to accept the fact that CLOUD-based systems are taking over from their favorite computer hardware and OS
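
Here is the multiplicative effect from observation #1 as a trivial sketch (the speedup figures are invented for illustration):

# Hypothetical numbers: hardware and software improvements multiply.
hardware_speedup = 100        # e.g. a decade of faster chips
software_speedup = 50         # e.g. switching to a better algorithm
combined_speedup = hardware_speedup * software_speedup
print(f"combined speedup: {combined_speedup:,}x")     # 5,000x, far more than either alone
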

Epiphany 17: What is old is new again (Python vs BASIC)

#!/bin/python3
# author : Neil Rieck
# created: 2019-08-22
# purpose: demo to show that Python is better than BASIC
#          (and most other languages) for serious work and/or noodling
# -----------------------------------------------------------
import math                            # math library
#
print("pi     :", format(math.pi, '.48f'))
print("1/3    :", format(1/3, '48f'))
print("2^32   :", 2**32)               # exceeds a 32-bit signed int (BASIC fails)
print("2^64   :", 2**64)               # exceeds a 64-bit signed int (BASIC fails)
print("2^128  :", 2**128)              # difficult in many languages
print("2^256  :", 2**256)              # ''
print("2^512  :", 2**512)              # ''
print("2^9999 :", 2**9999)             # ''
print("2^99999:", 2**99999)            # works in Python
OUTPUT

pi     : 3.141592653589793115997963468544185161590576171875
1/3    :                                         0.333333
2^32   : 4294967296
2^64   : 18446744073709551616
2^128  : 340282366920938463463374607431768211456
2^256  : 11579208923731619542357098500868790785326998466...
2^512  : 13407807929942597099574024998205846127479365820...
2^9999 : 99753155844037919244187108134179254191174841594...

Note: "..." is where I chopped the line for this display





Epiphany 18: Industrial/Technical Evolutions (part-2)

Food-for-thought:

The First Industrial Age (Age of Steam) years: 1760-1860

The Second Industrial Age (Age of Electricity) years: 1860-1960

The Third Industrial Age (Age of Digital Data / Age of Information) years: 1960-2010?

comment: so when politicians claim they will "make America great again" or that "BREXIT can bring back lost jobs", those politicians are either lying or ignorant of the facts just presented. Truth be told, the jobs of a previous industrial age do not come back once that age has passed.

Speculation: The Fourth Industrial Age (Artificial Intelligence, deep learning, and more?) years: 2010-now

Epiphany 19: The dominance of C/C++ (part 3/3)

In the early days of computing, most hardware manufacturers also created, then sold, software. Putting aside operating systems for a moment, those vendors also sold programming tools including programming language compilers like Fortran (1956), ALGOL (1958) and COBOL (1959). One of the most important changes to the computing industry occurred when computer professionals got together to standardize computer languages.

One measure of any computer language's success is, I think, the number of standardizations. For example, although the BASIC language was once very popular (I am still forced to use VMS-BASIC every day by my employer in 2019), it has gone through a very small number of standardizations. This might have something to do with the fact that many BASIC implementations were so different that standardization was not possible or, perhaps, desirable. On the other hand, languages like C, C++ and Objective-C have gone through numerous standards and continue to be improved.

For example, non-standard "C" first appeared in 1972 and is now referred to as K&R C after its authors, Brian Kernighan and Dennis Ritchie. Improvements were formalized then published as C89 by ANSI in 1989 and C90 by ISO in 1990. This continued with the names C99, C11 and C18 as described here.

comment: it appears that "C" moves to a new standardization level approximately every 10 years (on average) whilst C++ moves to a new level approximately every 3 years (on average)
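
A quick check of the published standard years backs that up (a sketch using the commonly cited publication years; treating C89 and C90 as one standard is my simplification):

# Commonly cited publication years of the standards; average intervals computed below.
# Note: the long gap between C++03 and C++11 pulls the C++ average up; since C++11
# the cadence has been exactly three years.
c_years = [1989, 1999, 2011, 2018]                # C89/C90, C99, C11, C18
cpp_years = [1998, 2003, 2011, 2014, 2017, 2020]  # C++98, C++03, C++11, C++14, C++17, C++20

def average_gap(years):
    gaps = [b - a for a, b in zip(years, years[1:])]
    return sum(gaps) / len(gaps)

print(f"C   : a new standard roughly every {average_gap(c_years):.1f} years")
print(f"C++ : a new standard roughly every {average_gap(cpp_years):.1f} years")
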

Linux

Since 50% of all the digital devices on the planet "running an operating system" use some form of Linux, it should be no surprise that it is the Linux community that is pushing newer versions of C and C++.

Since gcc is used to build Linux, we should look at this toolset a little more closely.

Oddities

Make(s)

One thing usually overlooked in this ecosystem begins with the Unix Make utility, first written in 1977 by Stuart Feldman of Bell Labs. This was the easiest method to get an application running on another Unix implementation, which might be running on a completely foreign computer architecture. I'm jumping a lot of steps here, but after a time, SUN Microsystems thought that Make was getting a little long in the tooth so they developed a product called "Solaris Package Manager". It should surprise no one that Red Hat needed something similar in 1995 to move to Red Hat Linux 2, so they further developed the SUN product, calling it "Red Hat Package Manager" (file extension: '.rpm').


Neil Rieck
Waterloo, Ontario, Canada.